---
license: apache-2.0
language:
- ko
library_name: transformers
pipeline_tag: text-generation
datasets:
- maywell/ko_Ultrafeedback_binarized
---
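The metadata above declares a `transformers` text-generation model for Korean. Below is a minimal, hypothetical loading sketch using the standard `transformers` pipeline API; the repository id `YOUR_USERNAME/solar-ko-dpo` is a placeholder (this card does not state the final repo name), and the prompt and generation settings are illustrative.

```python
# Hypothetical usage sketch based on the metadata above (library_name: transformers,
# pipeline_tag: text-generation). The repo id below is a placeholder, not this model's real id.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="YOUR_USERNAME/solar-ko-dpo",  # placeholder: replace with the actual repository id
    device_map="auto",
    torch_dtype="auto",
)

# Prompt means "Introduce yourself briefly in Korean."
print(generator("한국어로 간단히 자기소개를 해줘.", max_new_tokens=128)[0]["generated_text"])
```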
# Explanation
- Applied DPO to a small number of layers of the base model using the open dataset, and saved only the adapter part (a rough reproduction sketch follows this list)
- Merged the base model and the tuned adapter together
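The two steps above could look roughly like the following. This is a minimal sketch, assuming the `trl`/`peft` stack (argument names follow trl ~0.7.x and may differ in other releases); the base model and dataset ids are taken from this card, while the LoRA target modules, hyperparameters, and output paths are illustrative assumptions rather than the settings actually used.

```python
# Sketch of adapter-only DPO followed by merging, assuming the trl/peft stack.
# Base model and dataset ids come from this card; LoRA targets, hyperparameters,
# and output paths are illustrative assumptions.
from datasets import load_dataset
from peft import LoraConfig, PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer, TrainingArguments
from trl import DPOTrainer

base_id = "beomi/OPEN-SOLAR-KO-10.7B"

tokenizer = AutoTokenizer.from_pretrained(base_id)
model = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto")

# Keep the trainable part small: low-rank adapters on a few projection layers only.
peft_config = LoraConfig(
    r=16,
    lora_alpha=32,
    lora_dropout=0.05,
    target_modules=["q_proj", "v_proj"],  # assumed; the card does not list the exact layers
    task_type="CAUSAL_LM",
)

train_dataset = load_dataset("maywell/ko_Ultrafeedback_binarized", split="train")

trainer = DPOTrainer(
    model,
    ref_model=None,  # with a peft_config, trl uses the frozen base weights as the reference
    args=TrainingArguments(
        output_dir="solar-ko-dpo-adapter",
        per_device_train_batch_size=1,
        gradient_accumulation_steps=8,
        num_train_epochs=1,
        learning_rate=5e-6,
    ),
    beta=0.1,
    train_dataset=train_dataset,
    tokenizer=tokenizer,
    peft_config=peft_config,
)
trainer.train()
trainer.model.save_pretrained("solar-ko-dpo-adapter")  # saves the adapter weights only

# Merge the tuned adapter back into the base model and save a standalone checkpoint.
base = AutoModelForCausalLM.from_pretrained(base_id, torch_dtype="auto")
merged = PeftModel.from_pretrained(base, "solar-ko-dpo-adapter").merge_and_unload()
merged.save_pretrained("solar-ko-dpo-merged")
tokenizer.save_pretrained("solar-ko-dpo-merged")
```

The merged checkpoint saved at the end is a standalone model that loads with plain `transformers`, without `peft` at inference time.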
# Base Model
- beomi/OPEN-SOLAR-KO-10.7B
# Used Corpus
- maywell/ko_Ultrafeedback_binarized
# Score

| Average | Ko-ARC | Ko-HellaSwag | Ko-MMLU | Ko-TruthfulQA | Ko-CommonGen V2 |
|---|---|---|---|---|---|
| 52.83 | 50 | 60.55 | 48.8 | 71.51 | 43.65 |
# Log
- 2024.01.25: Initial version uploaded
- 2024.02.10: README updated
- 2024.02.11: Scores updated
# LICENSE
- Apache 2.0
# Citation
- beomi/OPEN-SOLAR-KO-10.7B

@misc{solar_ko_junbum_2023, author = { {L. Junbum} }, title = { Solar-Ko-10.7b }, year = 2024, url = { https://huggingface.co/beomi/SOLAR-KO-10.7B }, publisher = { Hugging Face } }