---
base_model:
- terrycraddock/Reflection-Llama-3.1-8B
- cgato/L3-TheSpice-8b-v0.8.3
- SvalTek/L3-ColdBrew-SpicyReflect
- Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
library_name: transformers
tags:
- mergekit
- merge
---
# merge
This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).
## Merge Details
### Merge Method
This model was merged using the [DARE](https://arxiv.org/abs/2311.03099)-[TIES](https://arxiv.org/abs/2306.01708) merge method, with [SvalTek/L3-ColdBrew-SpicyReflect](https://huggingface.co/SvalTek/L3-ColdBrew-SpicyReflect) as the base.
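
For intuition, below is a minimal, illustrative sketch of the two steps DARE-TIES combines: DARE's per-tensor drop-and-rescale of each model's task vector (its delta from the base), and a simplified TIES-style sign election over the weighted contributions. This is not mergekit's implementation; the function names are hypothetical and details (e.g. how the elected sign is computed) are simplified.

```python
import torch

def dare_sparsify(delta: torch.Tensor, density: float) -> torch.Tensor:
    # DARE: randomly drop (1 - density) of the task-vector entries,
    # then rescale survivors by 1/density so the expectation is preserved.
    mask = torch.bernoulli(torch.full_like(delta, density))
    return delta * mask / density

def dare_ties_merge(base, deltas, weights, density):
    # Sparsify each model's delta (finetuned - base) and apply its weight.
    # With `normalize: false` (as in the config below), the weights are
    # used as-is rather than rescaled to sum to 1.
    contribs = torch.stack(
        [dare_sparsify(d, density) * w for d, w in zip(deltas, weights)]
    )
    # Simplified TIES sign election: per parameter, keep only the
    # contributions whose sign agrees with the sign of the weighted sum.
    elected_sign = contribs.sum(dim=0).sign()
    agree = contribs.sign() == elected_sign
    return base + (contribs * agree).sum(dim=0)

# Toy usage with random tensors standing in for a single weight matrix:
base = torch.randn(4, 4)
deltas = [torch.randn(4, 4) for _ in range(3)]
merged = dare_ties_merge(base, deltas, weights=[0.6, 0.4, 0.4], density=0.5)
```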
### Models Merged
The following models were included in the merge:
* [terrycraddock/Reflection-Llama-3.1-8B](https://huggingface.co/terrycraddock/Reflection-Llama-3.1-8B)
* [cgato/L3-TheSpice-8b-v0.8.3](https://huggingface.co/cgato/L3-TheSpice-8b-v0.8.3)
* [Nitral-AI/Hathor_Tahsin-L3-8B-v0.85](https://huggingface.co/Nitral-AI/Hathor_Tahsin-L3-8B-v0.85)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
models:
  - model: SvalTek/L3-ColdBrew-SpicyReflect
    parameters:
      density: 0.5
      weight: 0.6
  - model: cgato/L3-TheSpice-8b-v0.8.3
    parameters:
      density: 0.5
      weight: 0.4
  - model: terrycraddock/Reflection-Llama-3.1-8B
    parameters:
      density: 0.5
      weight: 0.4
  - model: Nitral-AI/Hathor_Tahsin-L3-8B-v0.85
    parameters:
      density: 0.5
      weight: 0.3
merge_method: dare_ties
base_model: SvalTek/L3-ColdBrew-SpicyReflect
parameters:
  normalize: false
  int8_mask: true
dtype: float16
```
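
Once the merge has been produced (e.g. by pointing mergekit's `mergekit-yaml` entry point at the configuration above), the output is a standard Llama-3-family checkpoint that loads with transformers as usual. A minimal sketch, assuming the merged weights live in a local directory `./merged` (a hypothetical path; substitute the actual output directory or Hub repo id):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# "./merged" is a hypothetical mergekit output directory;
# replace it with the real model path or Hub repo id.
model_path = "./merged"

tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForCausalLM.from_pretrained(
    model_path,
    torch_dtype=torch.float16,  # matches `dtype: float16` in the config
    device_map="auto",
)

prompt = "Briefly explain what a model merge is."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```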