ChuckMcSneed committed (verified)

Commit 4447986 · 1 Parent(s): 8be6163

Update README.md

Files changed (1)
  1. README.md (+2 -2)
README.md CHANGED
@@ -179,14 +179,14 @@ Then I SLERP-merged it with cognitivecomputations/dolphin-2.2-70b (Needed to bri
 | P | 5.25 | 5.25 |
 | Total | 19.75 | 19 |
 
-### Open LLM leaderboard
+### [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
 [Leaderboard on Huggingface](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
 |Model |Average|ARC |HellaSwag|MMLU |TruthfulQA|Winogrande|GSM8K|
 |--------------|-------|-----|---------|-----|----------|----------|-----|
 |Gembo-v1-70b |70.51 |71.25|86.98 |70.85|63.25 |80.51 |50.19|
 |Gembo-v1.1-70b|70.35 |70.99|86.9 |70.63|62.45 |80.51 |50.64|
 
-# [Open LLM Leaderboard Evaluation Results](https://huggingface.co/spaces/HuggingFaceH4/open_llm_leaderboard)
+
 Detailed results can be found [here](https://huggingface.co/datasets/open-llm-leaderboard/details_ChuckMcSneed__Gembo-v1.1-70b)
 
 | Metric |Value|
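
The hunk header above references the SLERP merge with cognitivecomputations/dolphin-2.2-70b described earlier in the README. As a point of reference only, here is a minimal sketch of spherical linear interpolation between two same-shaped weight tensors, roughly what a mergekit-style SLERP merge computes per matched tensor. The function name, interpolation factor, and LERP fallback threshold are illustrative assumptions, not code from this repository.

```python
import torch

def slerp_merge(t: float, a: torch.Tensor, b: torch.Tensor, eps: float = 1e-8) -> torch.Tensor:
    """Spherical linear interpolation between two same-shaped weight tensors.

    t=0 returns `a`, t=1 returns `b`; near-parallel tensors fall back to
    plain linear interpolation for numerical stability.
    (Hypothetical sketch, not taken from this repository's merge recipe.)
    """
    a_flat = a.flatten().float()
    b_flat = b.flatten().float()
    # Angle between the two weight vectors, computed on normalized copies
    dot = torch.dot(a_flat / (a_flat.norm() + eps), b_flat / (b_flat.norm() + eps))
    omega = torch.acos(dot.clamp(-1.0, 1.0))
    if omega.abs().item() < 1e-4:
        # Nearly colinear: plain LERP avoids dividing by sin(omega) ~ 0
        merged = (1.0 - t) * a_flat + t * b_flat
    else:
        so = torch.sin(omega)
        merged = (torch.sin((1.0 - t) * omega) / so) * a_flat \
               + (torch.sin(t * omega) / so) * b_flat
    return merged.reshape(a.shape).to(a.dtype)

# Example (hypothetical tensor names): blend one layer 40/60 toward the second model
# merged_w = slerp_merge(0.6, state_a["model.layers.0.mlp.up_proj.weight"],
#                             state_b["model.layers.0.mlp.up_proj.weight"])
```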