---
base_model:
- mukaj/Llama-3.1-Hawkish-8B
- unsloth/Meta-Llama-3.1-8B
- T145/KRONOS-8B-V1-P3
- T145/KRONOS-8B-V1-P1
- unsloth/Meta-Llama-3.1-8B-Instruct
library_name: transformers
tags:
- mergekit
- merge
---
# Untitled Model (1)

This is a merge of pre-trained language models created using [mergekit](https://github.com/cg123/mergekit).

## Merge Details
### Merge Method

This model was merged using the [TIES](https://arxiv.org/abs/2306.01708) merge method, with [unsloth/Meta-Llama-3.1-8B](https://huggingface.co/unsloth/Meta-Llama-3.1-8B) as the base.

### Models Merged

The following models were included in the merge:
* [mukaj/Llama-3.1-Hawkish-8B](https://huggingface.co/mukaj/Llama-3.1-Hawkish-8B)
* [T145/KRONOS-8B-V1-P3](https://huggingface.co/T145/KRONOS-8B-V1-P3)
* [T145/KRONOS-8B-V1-P1](https://huggingface.co/T145/KRONOS-8B-V1-P1)
* [unsloth/Meta-Llama-3.1-8B-Instruct](https://huggingface.co/unsloth/Meta-Llama-3.1-8B-Instruct)

### Configuration

The following YAML configuration was used to produce this model:

```yaml
base_model: unsloth/Meta-Llama-3.1-8B
dtype: bfloat16
merge_method: ties
parameters:
  density: 1.0
  weight: 1.0
slices:
- sources:
  - layer_range: [0, 32]
    model: T145/KRONOS-8B-V1-P1
    parameters:
      density: 1.0
      weight: 1.0
  - layer_range: [0, 32]
    model: T145/KRONOS-8B-V1-P3
    parameters:
      density: 1.0
      weight: 1.0
  - layer_range: [0, 32]
    model: mukaj/Llama-3.1-Hawkish-8B
    parameters:
      density: 1.0
      weight: 1.0
  - layer_range: [0, 32]
    model: unsloth/Meta-Llama-3.1-8B-Instruct
    parameters:
      density: 1.0
      weight: 1.0
  - layer_range: [0, 32]
    model: unsloth/Meta-Llama-3.1-8B
tokenizer_source: unsloth/Meta-Llama-3.1-8B-Instruct
```
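
### Reproducing the Merge

A merge like this can be run through mergekit's Python API instead of the CLI. The following is a minimal sketch, assuming mergekit is installed and the YAML configuration above has been saved locally as `ties-config.yml` (both the file name and output path below are placeholders):

```python
# Minimal reproduction sketch using mergekit's Python API.
# Assumptions: mergekit is installed (`pip install mergekit`), the YAML
# configuration above is saved as `ties-config.yml`, and `./merged` is a
# writable output directory.
import torch
import yaml

from mergekit.config import MergeConfiguration
from mergekit.merge import MergeOptions, run_merge

CONFIG_YML = "ties-config.yml"  # assumption: the configuration above, saved locally
OUTPUT_PATH = "./merged"        # assumption: where to write the merged weights

with open(CONFIG_YML, "r", encoding="utf-8") as fp:
    merge_config = MergeConfiguration.model_validate(yaml.safe_load(fp))

run_merge(
    merge_config,
    out_path=OUTPUT_PATH,
    options=MergeOptions(
        cuda=torch.cuda.is_available(),  # use a GPU if one is available
        copy_tokenizer=True,             # copy the tokenizer named in tokenizer_source
    ),
)
```

Equivalently, the `mergekit-yaml` command-line entry point accepts the same configuration file.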
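
## Usage

Below is a minimal inference sketch with Transformers. The repository id is a placeholder, since this card does not specify where the merged model is published:

```python
# Minimal inference sketch with Transformers. The repository id is a
# placeholder (this card does not name the final upload location).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "your-username/untitled-model-1"  # assumption: placeholder repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # matches the merge dtype above
    device_map="auto",
)

# The merged tokenizer comes from Meta-Llama-3.1-8B-Instruct, so the
# Llama 3.1 chat template is available.
messages = [{"role": "user", "content": "Briefly explain what a TIES merge does."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=128)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```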