Update README.md
README.md CHANGED
@@ -1,4 +1,12 @@
-
+---
+tags:
+- uncensored
+- llama-3
+- tess
+- lumimaid
+- Lumi-tess
+---
+Lumi-tess
 
 This model was created with the goal of a good Llama 3 uncensored model with long context.
 And it worked like a charm.
@@ -10,7 +18,7 @@ Uses llama 3 context
 
 Sampler-wise it has a very wide optimal range, so it works with lots of different settings.
 
-Thanks to the people who train the custom models
+Thanks to the people who train the custom models:
 Undi
 IkariDev
 For Lumimaid.
@@ -65,4 +73,4 @@ models:
 merge_method: breadcrumbs_ties
 base_model: I:\Llama-3-70B-Instruct-Gradient-262k
 dtype: bfloat16
-```
+```
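The last hunk shows only the tail of the mergekit recipe. As a minimal sketch of what a breadcrumbs_ties config of that shape looks like: the `models:` entries and the weight/density/gamma values below are illustrative placeholders, not the actual Lumi-tess recipe; only the merge_method, base_model, and dtype lines come from the hunk above.

```yaml
# Hypothetical mergekit config using the breadcrumbs_ties method.
# The models listed and their parameter values are placeholders,
# not the recipe actually used for Lumi-tess.
models:
  - model: path/to/finetuned-model-a
    parameters:
      weight: 0.5
      density: 0.9    # fraction of task-vector weights to keep
      gamma: 0.01     # fraction of largest-magnitude differences to drop
  - model: path/to/finetuned-model-b
    parameters:
      weight: 0.5
      density: 0.9
      gamma: 0.01
merge_method: breadcrumbs_ties
base_model: I:\Llama-3-70B-Instruct-Gradient-262k
dtype: bfloat16
```

A config like this is typically executed with `mergekit-yaml config.yaml <output-dir>` from the mergekit package.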