---
license: other
---
This repository contains the weights for the LLaMA-30b model. The model is released under a non-commercial license (see the LICENSE file).
You should only use this repository if you have been granted access to the model by filling out [this form](https://docs.google.com/forms/d/e/1FAIpQLSfqNECQnMkycAp2jP4Z9TFX0cGR4uf7b_fBxjY_OjhJILlKGA/viewform?usp=send_form) but either lost your copy of the weights or had trouble converting them to the Transformers format.
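If you fall into the second case, the checkpoints here are already in the Transformers format and can be loaded directly. Below is a minimal sketch, assuming `transformers`, `torch`, and `accelerate` are installed and that your hardware can hold the 30B parameters in half precision; the prompt is purely illustrative:

```python
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer

model_id = "huggyllama/llama-30b"

# Load the tokenizer and the fp16 weights; device_map="auto" spreads the
# layers across the available devices (requires the accelerate package).
tokenizer = LlamaTokenizer.from_pretrained(model_id)
model = LlamaForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,
    device_map="auto",
)

# Simple greedy generation to check that the weights load correctly.
prompt = "The capital of France is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```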
# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_huggyllama__llama-30b).
| Metric                | Value |
|-----------------------|------:|
| Avg.                  | 49.73 |
| ARC (25-shot)         | 61.43 |
| HellaSwag (10-shot)   | 84.73 |
| MMLU (5-shot)         | 58.45 |
| TruthfulQA (0-shot)   | 42.27 |
| Winogrande (5-shot)   | 80.03 |
| GSM8K (5-shot)        | 14.86 |
| DROP (3-shot)         |  6.33 |
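The scores above come from the leaderboard's own evaluation pipeline. For a rough local check, a sketch of a single-task run with EleutherAI's `lm-evaluation-harness` is shown below; the task name, harness version, and batch size are assumptions, so the resulting numbers may not match the leaderboard exactly:

```python
# Sketch of a single-task evaluation (HellaSwag, 10-shot) using the
# lm-evaluation-harness Python API (pip install lm-eval).
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=huggyllama/llama-30b,dtype=float16",
    tasks=["hellaswag"],
    num_fewshot=10,
    batch_size=8,  # adjust to fit your GPU memory
)
print(results["results"]["hellaswag"])
```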