fix eval results: remove drop etc.
README.md CHANGED

```diff
@@ -69,11 +69,10 @@ Detailed results can be found [here](https://huggingface.co/datasets/open-llm-le
 
 | Metric                | Value                     |
 |-----------------------|---------------------------|
-| Avg.                  |
+| Avg.                  | 67.84                     |
 | ARC (25-shot)         | 66.55                     |
 | HellaSwag (10-shot)   | 84.47                     |
 | MMLU (5-shot)         | 63.34                     |
 | TruthfulQA (0-shot)   | 61.22                     |
 | Winogrande (5-shot)   | 78.37                     |
-| GSM8K (5-shot)        |
-| DROP (3-shot)         | 32.66                     |
+| GSM8K (5-shot)        | 53.07                     |
```
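For reference, the updated Avg. of 67.84 is consistent with the unweighted mean of the six remaining benchmark scores once DROP is excluded. A minimal sanity check (Python used here purely for illustration; the rounding to two decimals is an assumption):

```python
# Sanity check: new Avg. as the unweighted mean of the six remaining
# Open LLM Leaderboard benchmarks, with DROP no longer counted.
scores = {
    "ARC (25-shot)": 66.55,
    "HellaSwag (10-shot)": 84.47,
    "MMLU (5-shot)": 63.34,
    "TruthfulQA (0-shot)": 61.22,
    "Winogrande (5-shot)": 78.37,
    "GSM8K (5-shot)": 53.07,
}
avg = sum(scores.values()) / len(scores)
print(round(avg, 2))  # 67.84, matching the updated Avg. row
```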