Update README.md
README.md CHANGED
@@ -29,8 +29,6 @@ This is a merge of pre-trained language models.
 
 This model is on the Llama-3 arch with Llama-3.1 merged in, so it has 8k context length, but this could possibly be extended slightly with RoPE due to the L3.1 layers.
 
-There is a retrofitted [L3.1 version](https://huggingface.co/x0000001/L3.1-Umbral-Storm-8B-t0.0001?not-for-all-audiences=true) as well, but performance is unknown.
-
 ### Merge Method
 
 This model was merged using the <b>NEARSWAP t0.0001</b> merge algorithm.
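
NearSwap blends the base model toward the secondary model only where their weights are already close; the threshold t (here 0.0001) controls how near they must be. Below is a minimal NumPy sketch of that interpolation rule as the algorithm is commonly described in merge cards; the names are illustrative, not the exact code used for this merge.

```python
import numpy as np

def nearswap(v0: np.ndarray, v1: np.ndarray, t: float) -> np.ndarray:
    """Interpolate v0 (base) toward v1 (secondary) where weights are near.

    Smaller t (e.g. 0.0001) means only near-identical weights get blended.
    """
    # Interpolation strength per weight: 1.0 where |v0 - v1| <= t,
    # falling toward 0.0 as the weights diverge.
    with np.errstate(divide="ignore"):
        lweight = t / np.abs(v0 - v1)
    lweight = np.nan_to_num(lweight, nan=1.0, posinf=1.0, neginf=1.0)
    np.clip(lweight, 0.0, 1.0, out=lweight)
    # Element-wise linear interpolation with that strength.
    return v0 * (1.0 - lweight) + v1 * lweight
```

Applied per tensor across the two source models, this leaves strongly diverging weights essentially at the base model's values, which is why such a tiny t still produces a conservative merge.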
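
On the context-extension note above: RoPE scaling can be applied at load time by overriding the model config. A hedged sketch using the Hugging Face transformers library, assuming linear scaling; the repo id and scaling factor are placeholders, and whether quality holds past the native 8k on this merge is untested.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "x0000001/L3-Umbral-Storm-8B-t0.0001"  # hypothetical placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(repo)
# Linear RoPE scaling: factor=2.0 stretches the 8k window toward 16k.
# Quality past the native length is not guaranteed on a merged model.
model = AutoModelForCausalLM.from_pretrained(
    repo,
    rope_scaling={"type": "linear", "factor": 2.0},
)
```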