louisbrulenaudet committed
Update README.md
README.md
CHANGED
@@ -13,6 +13,9 @@ license: apache-2.0
 language:
 - en
 library_name: transformers
+metrics:
+- accuracy
+pipeline_tag: text-generation
 ---
 <center><img src='https://i.imgur.com/0xFTuAX.png' width='450px'></center>
 
|
@@ -34,6 +37,19 @@ The implementation of SLERP involves the following steps:
 
 In essence, SLERP provides a robust mechanism for interpolating vectors, offering advantages in preserving directional information and mitigating issues associated with linear interpolation in high-dimensional spaces.
 
+## Evaluation
+
+The evaluation was performed using the HuggingFace Open LLM Leaderboard.
+
+| Model                                | Average   | ARC   | HellaSwag | MMLU  | TruthfulQA | Winogrande | GSM8K     | #Params (B) |
+|--------------------------------------|-----------|-------|-----------|-------|------------|------------|-----------|-------------|
+| **louisbrulenaudet/Pearl-7B-slerp**  | **72.75** | 68.00 | 87.16     | 64.04 | 62.35      | 81.29      | **73.62** | 7.24        |
+| mistralai/Mixtral-8x7B-Instruct-v0.1 | 72.62     | 70.22 | 87.63     | 71.16 | 64.58      | 81.37      | 60.73     | 46.7        |
+| microsoft/phi-2                      | 61.33     | 61.09 | 75.11     | 58.11 | 44.47      | 74.35      | 54.81     | 2.78        |
+| microsoft/Orca-2-13b                 | 58.64     | 60.67 | 79.81     | 60.37 | 56.41      | 76.64      | 17.97     | 13          |
+| mistralai/Mistral-7B-Instruct-v0.1   | 54.96     | 54.52 | 75.63     | 55.38 | 56.28      | 73.72      | 14.25     | 7.24        |
+| meta-llama/Llama-2-7b-hf             | 50.97     | 53.07 | 78.59     | 46.87 | 38.76      | 74.03      | 14.48     | 6.74        |
+
 ## Configuration
 
 ```yaml
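
For reference, the SLERP paragraph retained as context above can be made concrete with a short NumPy sketch. This is illustrative only, assuming plain 1-D weight vectors; it is not mergekit's implementation, and the `slerp` name and the parallel-vector fallback are assumptions:

```python
import numpy as np

def slerp(v0: np.ndarray, v1: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two vectors (illustrative sketch)."""
    # Angle between the vectors, from their normalized dot product.
    dot = np.dot(v0, v1) / (np.linalg.norm(v0) * np.linalg.norm(v1))
    omega = np.arccos(np.clip(dot, -1.0, 1.0))
    sin_omega = np.sin(omega)

    # Nearly parallel vectors: fall back to linear interpolation
    # (a common convention; mergekit's exact handling may differ).
    if sin_omega < eps:
        return (1.0 - t) * v0 + t * v1

    # SLERP weights follow the great arc between the vectors, which is
    # what preserves directional information compared to linear mixing.
    w0 = np.sin((1.0 - t) * omega) / sin_omega
    w1 = np.sin(t * omega) / sin_omega
    return w0 * v0 + w1 * v1

# Example: halfway between two orthogonal unit vectors.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
print(slerp(a, b, 0.5))  # ~[0.7071, 0.7071]
```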
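The Average column in the added table is consistent with the unweighted mean of the six benchmark scores; a quick sanity check, with scores copied verbatim from the table:

```python
# ARC, HellaSwag, MMLU, TruthfulQA, Winogrande, GSM8K, per the table above.
scores = {
    "louisbrulenaudet/Pearl-7B-slerp": [68.00, 87.16, 64.04, 62.35, 81.29, 73.62],
    "mistralai/Mixtral-8x7B-Instruct-v0.1": [70.22, 87.63, 71.16, 64.58, 81.37, 60.73],
}

for model, s in scores.items():
    print(f"{model}: {sum(s) / len(s):.2f}")
# Prints 72.74 and 72.62, matching the table up to rounding.
```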
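The new `pipeline_tag: text-generation` front-matter field declares how the model is meant to be served. Assuming the checkpoint works with the standard transformers text-generation pipeline (the prompt and generation settings below are arbitrary), a minimal usage sketch:

```python
from transformers import pipeline

# Load the merged model for the task declared by pipeline_tag.
# Assumes local hardware with enough memory for a 7B-parameter model.
generator = pipeline(
    "text-generation",
    model="louisbrulenaudet/Pearl-7B-slerp",
)

output = generator("Spherical linear interpolation is", max_new_tokens=64)
print(output[0]["generated_text"])
```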