|
---
license: apache-2.0
language:
- en
tags:
- merge
---
|
|
|
![image/png](https://i.ibb.co/Qr4BYgc/1.png) |
|
|
|
Test merge. An attempt at a model that is good at RP, ERP, and general tasks with 128k context. Every component here has `Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context` in its merge instead of the regular MistralYarn 128k. The reason is that I believe Epiculous merged it with Mistral Instruct v0.2 to make the first 32k of context as good as possible; if not, that's a shame.
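
The exact recipe for the 128k components isn't published here, but as a rough illustration, a component like Mistral-Instruct-v0.2-128k could be built with a mergekit SLERP config along these lines. The merge method, layer ranges, and interpolation weight are assumptions for the sketch, not the settings actually used:

```yaml
# Hypothetical mergekit SLERP config for one 128k component
# (Mistral-7B-Instruct-v0.2 + Fett-uccine-Long-Noodle).
# Method and parameters are illustrative assumptions only.
merge_method: slerp
base_model: mistralai/Mistral-7B-Instruct-v0.2
slices:
  - sources:
      - model: mistralai/Mistral-7B-Instruct-v0.2
        layer_range: [0, 32]
      - model: Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context
        layer_range: [0, 32]
parameters:
  t: 0.5  # even blend between the two parents
dtype: bfloat16
```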
|
|
|
|
|
Here is the "family tree" of this model. I'm not writing the full model names because they're long af.
|
### NeuralKunoichi-EroSumika 4x7B |
|
```
* NeuralKunoichi-EroSumika 4x7B
 *(1) Kunocchini-7b-128k
 |
 *(2) Mistral-Instruct-v0.2-128k
 |      * Mistral-7B-Instruct-v0.2
 |      |
 |      * Fett-128k
 |
 *(3) Erosumika-128k
 |      * Erosumika 7B
 |      |
 |      * Fett-128k
 |
 *(4) Mistral-NeuralHuman-128k
        * Fett-128k
        |
        * Mistral-NeuralHuman
               * Mistral_MoreHuman
               |
               * Mistral-Neural-Story
```
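
The final 4x7B combines the four 128k components above as experts. The actual config isn't included on this card, but a mergekit-moe recipe for a model like this generally looks like the sketch below. The short component names are placeholders for the real repos, and the base model choice, gate mode, and positive prompts are assumptions for illustration:

```yaml
# Hypothetical mergekit-moe config for NeuralKunoichi-EroSumika 4x7B.
# Component names are shortened placeholders; base_model, gate_mode
# and positive_prompts are illustrative assumptions.
base_model: Mistral-Instruct-v0.2-128k
gate_mode: hidden   # route tokens by hidden-state similarity to the prompts below
dtype: bfloat16
experts:
  - source_model: Kunocchini-7b-128k
    positive_prompts:
      - "roleplay"
  - source_model: Mistral-Instruct-v0.2-128k
    positive_prompts:
      - "instruction following"
  - source_model: Erosumika-128k
    positive_prompts:
      - "erotic roleplay"
  - source_model: Mistral-NeuralHuman-128k
    positive_prompts:
      - "storywriting"
```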