This is the tzem model fine-tuned on instruct data.
## Prompt template

```
**사용자:** {prompt}
**인공지능:**
```

The role markers are Korean: `사용자` means "user" and `인공지능` means "AI". The model's reply is generated after the `**인공지능:**` marker.
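The template can be wrapped in a small helper so the format string lives in one place. This is a minimal sketch; the `build_prompt` name is illustrative and not part of the model card.

```python
def build_prompt(user_message: str) -> str:
    """Format a single-turn prompt in the template shown above."""
    # 사용자 = "user", 인공지능 = "AI"; the model continues after the AI marker.
    return f"**사용자:** {user_message}\n**인공지능:**"
```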
## Usage
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "blueapple8259/tzem-instruct"

# Load the tokenizer and model from the Hub.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "여기에 프롬프트 입력"  # "enter your prompt here"

# Wrap the prompt in the template shown above.
text = f"**사용자:** {prompt}\n**인공지능:**"

inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    pad_token_id=tokenizer.pad_token_id,
    eos_token_id=tokenizer.eos_token_id,
    do_sample=True,
    top_p=0.2,
)
output = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(output)
```
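`generate` returns the prompt tokens together with the completion, so the decoded text repeats the user turn, as in the examples below. If only the model's reply is needed, one option is to split on the template marker. A minimal sketch, assuming the reply always follows `**인공지능:**`:

```python
# Keep only the text generated after the AI marker.
reply = output.split("**인공지능:**", 1)[-1].strip()
print(reply)
```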
## Examples

```
**사용자:** 인터넷 브라우저에 대해 알려줘.
**인공지능:** 인터넷 브라우저는 인터넷에서 정보를 검색하고, 다른 사용자와 소통하는 데 사용되는 소프트웨어입니다.
```

(User: "Tell me about internet browsers." AI: "An internet browser is software used to search for information on the internet and to communicate with other users.")

```
**사용자:** 건강을 유지하기 위한 세 가지 팁을 알려주세요.
**인공지능:** 1. 충분한 수면을 취하세요.
2. 건강한 식단을 섭취하세요.
3. 규칙적으로 운동하세요.
```

(User: "Please tell me three tips for staying healthy." AI: "1. Get enough sleep. 2. Eat a healthy diet. 3. Exercise regularly.")
## Dataset
- KoAlpaca (data containing code or tables excluded)
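The card does not spell out how that filtering was done. A minimal sketch of this kind of filtering, assuming the data comes from the `beomi/KoAlpaca-v1.1a` dataset on the Hub (an assumption; the exact source and criteria are not given here) and that simple markdown heuristics are enough to spot code and tables:

```python
from datasets import load_dataset

# Assumed dataset id and field names; the model card does not specify them.
ds = load_dataset("beomi/KoAlpaca-v1.1a", split="train")

def has_code_or_table(example):
    text = example["instruction"] + "\n" + example["output"]
    # Crude heuristics: fenced code blocks and markdown table rows.
    return "```" in text or text.count("|") >= 4

filtered = ds.filter(lambda ex: not has_code_or_table(ex))
print(len(ds), "->", len(filtered))
```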