xxx777xxxASD committed
Update README.md
README.md CHANGED
@@ -6,9 +6,9 @@ tags:
 - merge
 ---
 
-![image/png](https://i.ibb.co/
+![image/png](https://i.ibb.co/MRXkh6p/icon2.png)
 
-Test merge. Attempt to get good at RP, ERP, general things model with 128k context. Every model here has `Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context` in merge instead of regular MistralYarn 128k. The reason is because i belive Epiculous merged it with Mistral Instruct v0.2 to make first 32k context as
+Test merge. Attempt to get good at RP, ERP, general things model with 128k context. Every model here has `Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context` in merge instead of regular MistralYarn 128k. The reason is because i belive Epiculous merged it with Mistral Instruct v0.2 to make first 32k context as perfect as possible until YaRN scaling would start from 32k, if not than it's sad.
 
 
 Here is the "family tree" of this model, im not writing full model names cause they long af
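For illustration only, here is a minimal sketch of how a merge like the one described in the updated README might be declared. It assumes the merge was built with mergekit, which this commit does not confirm; the second source model, the slerp method, and every parameter value are placeholders, and only `Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context` is taken from the card.

```yaml
# Hypothetical mergekit config sketching the idea from the README:
# the 120k-context Fett-uccine-Long-Noodle model stands in for a plain
# MistralYarn 128k base in the merge. The second model, the slerp method,
# and all parameter values are placeholders, not the card's actual recipe.
merge_method: slerp
base_model: Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context
slices:
  - sources:
      - model: Epiculous/Fett-uccine-Long-Noodle-7B-120k-Context
        layer_range: [0, 32]                 # Mistral 7B has 32 layers
      - model: SomeUser/Some-RP-Mistral-7B   # placeholder for an RP-tuned Mistral 7B
        layer_range: [0, 32]
parameters:
  t: 0.5          # even blend between the two models
dtype: bfloat16
```

With mergekit installed, a config like this would be run as `mergekit-yaml config.yml ./merged-model`.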