---
license: apache-2.0
language:
- ko
library_name: transformers
pipeline_tag: text-generation
datasets:
- maywell/ko_Ultrafeedback_binarized
---

**Explanation**
- Starting from the base model, applied DPO (Direct Preference Optimization) to a small number of layers using the open dataset listed below, and saved only the adapter weights (see the sketch after this list)
- Merged the tuned adapter back into the base model
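
The training script itself is not published in this card; the following is a minimal sketch of the two steps above, assuming `trl`'s `DPOTrainer` with a `peft` LoRA adapter. The rank, target modules, hyperparameters, and output paths are placeholder assumptions, not the author's actual values.

```python
# Sketch only: hyperparameters, target layers, and paths are illustrative assumptions.
import torch
from datasets import load_dataset
from peft import AutoPeftModelForCausalLM, LoraConfig
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

base_id = "beomi/OPEN-SOLAR-KO-10.7B"
tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype=torch.bfloat16)

# Adapter restricted to a small set of layers, per the explanation above;
# the modules and rank here are assumptions, not published values.
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=["q_proj", "v_proj"],
    task_type="CAUSAL_LM",
)

# Preference pairs; DPOTrainer expects prompt/chosen/rejected columns
# (split name assumed; check the dataset card).
train_dataset = load_dataset("maywell/ko_Ultrafeedback_binarized", split="train")

trainer = DPOTrainer(
    model=model,
    ref_model=None,  # with peft_config set, the frozen base acts as the reference
    beta=0.1,        # assumed DPO temperature
    args=TrainingArguments(output_dir="dpo-adapter",
                           per_device_train_batch_size=1,
                           gradient_accumulation_steps=8,
                           num_train_epochs=1),
    train_dataset=train_dataset,
    tokenizer=tokenizer,
    peft_config=peft_config,
)
trainer.train()
trainer.save_model("dpo-adapter")  # saves just the adapter part

# Merge the tuned adapter back into the base model and save full weights.
merged = AutoPeftModelForCausalLM.from_pretrained("dpo-adapter",
                                                  torch_dtype=torch.bfloat16)
merged = merged.merge_and_unload()
merged.save_pretrained("solar-ko-dpo-merged")
tokenizer.save_pretrained("solar-ko-dpo-merged")
```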

**Base Model**
- [beomi/OPEN-SOLAR-KO-10.7B](https://huggingface.co/beomi/OPEN-SOLAR-KO-10.7B)
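
Since the adapter is merged in, the model loads like any causal LM in `transformers`. A hedged usage sketch; the repo ID below is a placeholder, since this card does not state the model's own Hub path:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "<this-model-hub-id>"  # placeholder: replace with this model's repo path
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(
    repo_id, torch_dtype=torch.float16, device_map="auto"
)

prompt = "한국의 수도는 어디인가요?"  # "What is the capital of Korea?"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128,
                         do_sample=True, temperature=0.7)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```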

**Used Corpus**
- [maywell/ko_Ultrafeedback_binarized](https://huggingface.co/datasets/maywell/ko_Ultrafeedback_binarized)
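
A quick way to inspect the preference data; the split and column names follow the usual Ultrafeedback-binarized layout and should be verified against the dataset card:

```python
from datasets import load_dataset

ds = load_dataset("maywell/ko_Ultrafeedback_binarized", split="train")
print(ds)                # expected columns: prompt / chosen / rejected pairs
print(ds[0]["prompt"])   # assumed column name; check the dataset viewer if it differs
```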

**Score**
|Average|Ko-ARC|Ko-HellaSwag|Ko-MMLU|Ko-TruthfulQA|Ko-CommonGen V2|
|:---:|:---:|:---:|:---:|:---:|:---:|
|52.83|50|60.55|48.8|43.65|61.16|

**Log**
- 2024.01.25: Initial version uploaded
- 2024.02.10: README updated
- 2024.02.11: Score updated

**LICENSE**
- Apache 2.0

**Citation**
- beomi/OPEN-SOLAR-KO-10.7B
  ```
  @misc {solar_ko_junbum_2023,
      author       = { {L. Junbum} },
      title        = { Solar-Ko-10.7b },
      year         = 2024,
      url          = { https://huggingface.co/beomi/SOLAR-KO-10.7B },
      publisher    = { Hugging Face }
  }
  ```