# Sheared LLaMA on Deita-6K
This is [Sheared LLaMA 2.7B](https://huggingface.co/princeton-nlp/Sheared-LLaMA-2.7B) supervised fine-tuned on [Deita-6k](https://huggingface.co/datasets/hkust-nlp/deita-6k-v0) for 2 epochs.
This model is provided for a fair comparison with [LLaMA-MoE-v1-sft](https://huggingface.co/llama-moe/LLaMA-MoE-v1-3_5B-2_8-sft), which was fine-tuned with the same recipe.
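For reference, a minimal loading sketch using the standard 🤗 Transformers API. The repo id below is a placeholder, since this card does not state the published model id; substitute the actual one.

```python
# Minimal sketch: load the SFT model with the standard transformers API.
# NOTE: the repo id below is hypothetical, not confirmed by this card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "princeton-nlp/Sheared-LLaMA-2.7B-sft"  # placeholder id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

inputs = tokenizer("What is supervised fine-tuning?", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```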