Quantization made by Richard Erkhov.
gemma-2b-translation-v0.92 - AWQ
- Model creator: https://huggingface.co/lemon-mint/
- Original model: https://huggingface.co/lemon-mint/gemma-2b-translation-v0.92/
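The AWQ weights load through the standard transformers API once autoawq is installed. A minimal sketch, assuming a placeholder repo id for this quantized repository:

```python
# Minimal loading sketch for the AWQ quantization (requires `pip install autoawq transformers`).
# The repo id below is a placeholder; replace it with this quantized repository's id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "path/to/gemma-2b-translation-v0.92-awq"  # placeholder
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
```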
Original model description:
---
library_name: transformers
language:
- ko
license: gemma
tags:
- gemma
- pytorch
- instruct
- finetune
- translation
widget:
- messages:
  - role: user
    content: "Translate into Korean.\nEnglish:\n\nHamsters don't eat cats."
inference:
  parameters:
    max_new_tokens: 2048
base_model: google/gemma-1.1-2b-it
pipeline_tag: text-generation
---
Gemma 2B Translation v0.92
- Eval Loss: 0.9056
- Train Loss: 0.7346
- lr: 5e-5
- optimizer: adamw
- lr_scheduler_type: cosine
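As a rough illustration, these hyperparameters correspond to a transformers TrainingArguments configuration like the sketch below. This assumes the Hugging Face Trainer was used; output_dir, batch size, and epoch count are placeholders not stated in the card.

```python
# Illustrative mapping of the listed hyperparameters to TrainingArguments.
# Only learning_rate, optim, and lr_scheduler_type come from the card;
# everything else is a placeholder.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gemma-2b-translation-finetune",  # placeholder
    learning_rate=5e-5,
    optim="adamw_torch",
    lr_scheduler_type="cosine",
    per_device_train_batch_size=4,  # placeholder
    num_train_epochs=1,             # placeholder
)
```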
Prompt Template
```
<bos><start_of_turn>user
Translate into Korean.
English:
Hamsters don't eat cats.<end_of_turn>
<start_of_turn>model
햄스터는 고양이를 먹지 않습니다.<eos>
```
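The same prompt can be built programmatically with the tokenizer's chat template. A minimal generation sketch using the original (non-quantized) model id from this card; swap in the AWQ repo id to run the quantized weights:

```python
# Sketch: build the prompt above via the chat template and generate a translation.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "lemon-mint/gemma-2b-translation-v0.92"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

messages = [
    {"role": "user", "content": "Translate into Korean.\nEnglish:\n\nHamsters don't eat cats."}
]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=2048)
# Decode only the newly generated tokens (the model's translation).
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```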
Model Description
- Developed by: lemon-mint
- Model type: Gemma
- Language(s) (NLP): English
- License: gemma-terms-of-use
- Finetuned from model: google/gemma-1.1-2b-it