Update README.md
![image/png](https://i.ibb.co/MRXkh6p/icon2.png)
Test merge. An attempt at a model that is good at RP, ERP, and general tasks with 128k context. Every model here has `Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context` in the merge instead of the regular MistralYarn 128k. The reason is that I believe Epiculous merged it with Mistral Instruct v0.2 to make the first 32k of context as good an experience as possible before YaRN kicks in from 32k to 128k. If not, that's sad D:, or I got something wrong.
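For anyone curious how a model like Fett-uccine-Long-Noodle gets slotted into a merge, here is a minimal mergekit-style recipe sketch. Everything except the Fett-uccine-Long-Noodle name is a hypothetical placeholder — the merge method, the second model, and the weights are illustrative, not the actual recipe used for this model.

```yaml
# Hypothetical mergekit recipe sketch. merge_method, the second model,
# and all parameter values are illustrative placeholders, not the
# recipe actually used for this model.
slices:
  - sources:
      - model: Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context
        layer_range: [0, 32]
      - model: some-org/some-rp-7b          # hypothetical placeholder
        layer_range: [0, 32]
merge_method: slerp
base_model: Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context
parameters:
  t: 0.5        # 0.0 = all base model, 1.0 = all second model
dtype: bfloat16
```

Putting the long-context model on the `base_model` side is what carries its extended-context behavior into the result.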
Here is the "family tree" of this model. I'm not writing the full model names because they're long af.