
Quantization made by Richard Erkhov.

Github | Discord | Request more models

gemma-2b-translation-v0.92 - AWQ

Original model description:

library_name: transformers
language:
  - ko
license: gemma
tags:
  - gemma
  - pytorch
  - instruct
  - finetune
  - translation
widget:
  - messages:
      - role: user
        content: "Translate into Korean.\nEnglish:\n\nHamsters don't eat cats."
inference:
  parameters:
    max_new_tokens: 2048
base_model: google/gemma-1.1-2b-it
pipeline_tag: text-generation

Gemma 2B Translation v0.92

  • Eval Loss: 0.9056
  • Train Loss: 0.7346
  • lr: 5e-5
  • optimizer: adamw
  • lr_scheduler_type: cosine
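
For reference, a minimal sketch of how these hyperparameters might be expressed with the Hugging Face Trainer. This is not the author's training script; the output path, batch size, epoch count, and precision flag are placeholders, not details reported above.

# Sketch only: mirrors the reported hyperparameters (lr 5e-5, AdamW,
# cosine schedule). Dataset, batch size, epochs, and bf16 are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gemma-2b-translation-v0.92",  # placeholder output path
    learning_rate=5e-5,                       # lr: 5e-5
    optim="adamw_torch",                      # optimizer: adamw
    lr_scheduler_type="cosine",               # lr_scheduler_type: cosine
    per_device_train_batch_size=4,            # assumption, not reported
    num_train_epochs=1,                       # assumption, not reported
    bf16=True,                                # assumption, not reported
)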

Prompt Template

<bos><start_of_turn>user
Translate into Korean.
English:

Hamsters don't eat cats.<end_of_turn>
<start_of_turn>model
햄스터는 고양이를 먹지 않습니다.<eos>
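
As a usage illustration, here is a minimal sketch of loading this AWQ quant with transformers and applying the chat template shown above. The repository id and generation settings are assumptions, not instructions from the original card; max_new_tokens follows the 2048 value in the metadata.

# Sketch only: repo id below is a hypothetical placeholder.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "RichardErkhov/gemma-2b-translation-v0.92-awq"  # hypothetical repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

messages = [
    {"role": "user",
     "content": "Translate into Korean.\nEnglish:\n\nHamsters don't eat cats."},
]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(input_ids, max_new_tokens=2048)
# Decode only the newly generated tokens (the Korean translation).
print(tokenizer.decode(outputs[0][input_ids.shape[-1]:], skip_special_tokens=True))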

Model Description
