---
license: llama3.1
---

|             | Wiki | C4   | PIQA  | ARC-E | ARC-C | HellaSwag | Wino  | Avg.  |
| ----------- | ---- | ---- | ----- | ----- | ----- | --------- | ----- | ----- |
| Unquantized | 2.82 | 7.18 | 82.81 | 85.31 | 59.64 | 67.49     | 82.00 | 75.45 |
| W4G64       | TBA  | TBA  | TBA   | TBA   | TBA   | TBA       | TBA   | TBA   |
| W3G64       | TBA  | TBA  | TBA   | TBA   | TBA   | TBA       | TBA   | TBA   |

Wiki and C4 are perplexities (lower is better); the remaining columns are zero-shot accuracies (%), and Avg. is the mean of the five accuracy columns.

Revisions available in this repository (see the loading example below):

- `main` (W4G64, scales learned);
- `nfl_w3g64` (W3G64, scales learned).
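To fetch a specific variant, pass the branch name via the `revision` argument when loading. Below is a minimal sketch using the standard `transformers` API; the repository id is a placeholder, and the quantized checkpoints may additionally require the companion quantization library to be installed.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Placeholder repository id -- replace with the actual id of this repository.
repo_id = "<namespace>/<this-repo>"

# revision="main" selects the W4G64 checkpoint; use revision="nfl_w3g64" for W3G64.
model = AutoModelForCausalLM.from_pretrained(
    repo_id,
    revision="nfl_w3g64",
    torch_dtype="auto",
    device_map="auto",  # requires `accelerate`
)
tokenizer = AutoTokenizer.from_pretrained(repo_id, revision="nfl_w3g64")
```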
Evaluations are provided for models with learned scales.<br>Benchmark scores (zero-shot) are computed with [`lm-evaluation-harness`](https://github.com/EleutherAI/lm-evaluation-harness).
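For reference, the sketch below shows how such zero-shot scores can be computed with the Python API of `lm-evaluation-harness`, assuming a recent version that exposes `simple_evaluate` and `HFLM`; the repository id, task names, and batch size here are assumptions, so the resulting numbers may not match the table exactly.

```python
from lm_eval import simple_evaluate
from lm_eval.models.huggingface import HFLM

# Placeholder repository id; `revision` picks the quantized variant to evaluate.
lm = HFLM(pretrained="<namespace>/<this-repo>", revision="main", batch_size=8)

# Zero-shot evaluation on tasks corresponding to the table columns
# (task names assumed from the harness defaults).
results = simple_evaluate(
    model=lm,
    tasks=["piqa", "arc_easy", "arc_challenge", "hellaswag", "winogrande"],
    num_fewshot=0,
)
print(results["results"])
```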