---
library_name: transformers
language:
- en
pipeline_tag: text-generation
tags:
- nlp
- code
- microsoft
---
Official [AQLM](https://arxiv.org/abs/2401.06118) quantization of [microsoft/Phi-3-mini-128k-instruct](https://huggingface.co/microsoft/Phi-3-mini-128k-instruct).
For this quantization, we used 1 codebook of 16 bits (the 1x16 configuration shown in the table below).
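
A minimal loading sketch, assuming `transformers` with native AQLM support (>= 4.38) and the `aqlm[gpu]` package installed; the repo id below is a placeholder for this checkpoint's Hub id:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "<this-repo-id>"  # placeholder: replace with this model's Hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # keep the dtype stored in the checkpoint
    device_map="auto",    # place layers on the available GPU(s)
)

prompt = "Explain additive quantization in one sentence."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```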
Results:
| Model | Quantization | MMLU (5-shot) | ArcC | ArcE | Hellaswag | Winogrande | PiQA | Model size, GB |
|------|------|------|------|------|------|------|------|------|
| microsoft/Phi-3-mini-128k-instruct | None | 0.6881 | 0.5418 | 0.8127 | 0.5980 | 0.7873 | 0.7340 | 7.6 |
| | 1x16 | 0.5815 | 0.4599 | 0.7845 | 0.5235 | 0.7666 | 0.6930 | 1.4 |