---
license: apache-2.0
tags:
- merge
- mergekit
- lazymergekit
- Locutusque/Hercules-6.1-Llama-3.1-8B
- Sao10K/Llama-3.1-8B-Stheno-v3.4
---
# ZeroXClem/Stheno-Hercules-3.1-8B

ZeroXClem/Stheno-Hercules-3.1-8B is a merge of the following models using [mergekit](https://github.com/cg123/mergekit):

* [Locutusque/Hercules-6.1-Llama-3.1-8B](https://huggingface.co/Locutusque/Hercules-6.1-Llama-3.1-8B)
* [Sao10K/Llama-3.1-8B-Stheno-v3.4](https://huggingface.co/Sao10K/Llama-3.1-8B-Stheno-v3.4)
## 🧩 Configuration

```yaml
slices:
  - sources:
      - model: Locutusque/Hercules-6.1-Llama-3.1-8B
        layer_range: [0, 32]
      - model: Sao10K/Llama-3.1-8B-Stheno-v3.4
        layer_range: [0, 32]
merge_method: slerp
base_model: Locutusque/Hercules-6.1-Llama-3.1-8B
parameters:
  t:
    - filter: self_attn
      value: [0, 0.5, 0.3, 0.7, 1]
    - filter: mlp
      value: [1, 0.5, 0.7, 0.3, 0]
    - value: 0.5
dtype: bfloat16
```
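
The `t` values set the SLERP interpolation factor per layer group: `t = 0` keeps the base model (Hercules) and `t = 1` takes Stheno, so self-attention weights drift toward Stheno in the deeper layers, MLP weights do the opposite, and all remaining tensors are blended evenly at `t = 0.5`.

## 🔄 Reproducing the Merge

To reproduce the merge, save the configuration above as `config.yaml` and run it through mergekit. The sketch below uses mergekit's Python entry points (`MergeConfiguration`, `run_merge`, `MergeOptions`); these names track current mergekit releases but may shift between versions, so treat it as a starting point. The `mergekit-yaml config.yaml ./merged --copy-tokenizer` CLI is the equivalent one-liner.

```python
# Sketch: reproduce the merge with mergekit's Python API.
# Assumes `pip install mergekit` and that config.yaml holds the YAML above.
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

with open("config.yaml", "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path="./Stheno-Hercules-3.1-8B",  # where the merged weights land
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # merge on GPU if one is present
        copy_tokenizer=True,             # carry the base model's tokenizer over
        lazy_unpickle=True,              # stream shards to keep RAM usage down
        low_cpu_memory=True,
    ),
)
```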
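
## 💻 Usage

A minimal inference sketch with 🤗 Transformers. It assumes the merged weights are hosted under this repo id and that the tokenizer ships the Llama 3.1 chat template; the sampling settings are illustrative, not tuned.

```python
# pip install -U transformers accelerate
import torch
import transformers
from transformers import AutoTokenizer

model = "ZeroXClem/Stheno-Hercules-3.1-8B"
messages = [{"role": "user", "content": "What is a large language model?"}]

tokenizer = AutoTokenizer.from_pretrained(model)
# Format the conversation with the chat template bundled with the tokenizer.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

pipeline = transformers.pipeline(
    "text-generation",
    model=model,
    torch_dtype=torch.bfloat16,  # matches the merge dtype above
    device_map="auto",
)

outputs = pipeline(
    prompt,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
    top_k=50,
    top_p=0.95,
)
print(outputs[0]["generated_text"])
```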