---
base_model:
- mlabonne/Meta-Llama-3-225B-Instruct
library_name: transformers
tags:
- mergekit
- merge
---

# Tiny-ll

This is a merge of pre-trained language models created using [mergekit](https://github.com/arcee-ai/mergekit).

### Merge Method

This model was merged using the passthrough merge method, which copies the selected layer ranges from the source model into the output unchanged, stacking them into a single deeper network.
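Because consecutive slices in the configuration below overlap by 10 layers, the merge duplicates intermediate layers of the source model. A minimal sketch of the resulting depth, assuming only the slice list shown in the Configuration section:

```python
# Sketch: derive the merged model's layer count from the slice list
# in the Configuration section (13 slices of 20 layers, each slice
# starting 10 layers after the previous one).
slices = [(start, start + 20) for start in range(0, 130, 10)]
depth = sum(end - start for start, end in slices)
print(slices[0], slices[-1], depth)  # (0, 20) (120, 140) 260
```
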
### Models Merged
The following models were included in the merge:
* [mlabonne/Meta-Llama-3-225B-Instruct](https://huggingface.co/mlabonne/Meta-Llama-3-225B-Instruct)
### Configuration
The following YAML configuration was used to produce this model:
```yaml
slices:
- sources:
  - layer_range: [0, 20]
    model: mlabonne/Meta-Llama-3-225B-Instruct
- sources:
  - layer_range: [10, 30]
    model: mlabonne/Meta-Llama-3-225B-Instruct
- sources:
  - layer_range: [20, 40]
    model: mlabonne/Meta-Llama-3-225B-Instruct
- sources:
  - layer_range: [30, 50]
    model: mlabonne/Meta-Llama-3-225B-Instruct
- sources:
  - layer_range: [40, 60]
    model: mlabonne/Meta-Llama-3-225B-Instruct
- sources:
  - layer_range: [50, 70]
    model: mlabonne/Meta-Llama-3-225B-Instruct
- sources:
  - layer_range: [60, 80]
    model: mlabonne/Meta-Llama-3-225B-Instruct
- sources:
  - layer_range: [70, 90]
    model: mlabonne/Meta-Llama-3-225B-Instruct
- sources:
  - layer_range: [80, 100]
    model: mlabonne/Meta-Llama-3-225B-Instruct
- sources:
  - layer_range: [90, 110]
    model: mlabonne/Meta-Llama-3-225B-Instruct
- sources:
  - layer_range: [100, 120]
    model: mlabonne/Meta-Llama-3-225B-Instruct
- sources:
  - layer_range: [110, 130]
    model: mlabonne/Meta-Llama-3-225B-Instruct
- sources:
  - layer_range: [120, 140]
    model: mlabonne/Meta-Llama-3-225B-Instruct
merge_method: passthrough
dtype: float16
```
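
To reproduce the merge, the configuration above can be saved to a file and passed to mergekit's `mergekit-yaml` CLI (for example, `mergekit-yaml config.yaml ./merged`). A minimal loading sketch with `transformers` follows; the repository id is a hypothetical placeholder for this merge, so substitute the actual repo path:

```python
# Minimal loading sketch. Assumptions: the repo id below is a
# hypothetical placeholder; float16 matches `dtype` in the config.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "wassemgtk/Tiny-ll"  # hypothetical placeholder repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # matches `dtype: float16` above
    device_map="auto",          # requires the accelerate package
)

inputs = tokenizer("Hello, world!", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```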