tzem λͺ¨λΈμ„ instruct λ°μ΄ν„°λ‘œ νŒŒμΈνŠœλ‹ ν•œ λͺ¨λΈμž…λ‹ˆλ‹€.

Prompt template

**μ‚¬μš©μž:** {prompt}
**인곡지λŠ₯:**
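
For multi-turn use, a minimal sketch of flattening a conversation into this template (the `build_prompt` helper below is hypothetical, not part of this model card; joining turns with newlines follows the layout of the examples further down):

def build_prompt(turns, next_user_message):
    # Hypothetical helper: join past (user, ai) turns in the model's
    # template, ending with "**인곡지λŠ₯:**" so the model continues
    # with the next AI reply.
    parts = [f"**μ‚¬μš©μž:** {u}\n**인곡지λŠ₯:** {a}" for u, a in turns]
    parts.append(f"**μ‚¬μš©μž:** {next_user_message}\n**인곡지λŠ₯:**")
    return "\n".join(parts)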

Usage

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "blueapple8259/tzem-instruct"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Enter your prompt here"

# Wrap the raw prompt in the model's template; generation continues after "**인곡지λŠ₯:**".
text = f"**μ‚¬μš©μž:** {prompt}\n**인곡지λŠ₯:**"

inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    pad_token_id=tokenizer.pad_token_id,
    eos_token_id=tokenizer.eos_token_id,
    do_sample=True,
    top_p=0.2,  # low top-p keeps sampling close to greedy decoding
)

output = tokenizer.decode(outputs[0], skip_special_tokens=True)
print(output)
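
Note that `tokenizer.decode(outputs[0], ...)` returns the prompt together with the completion. To keep only the model's reply, one common pattern (a sketch, not from the original card) is to decode just the newly generated tokens:

prompt_len = inputs["input_ids"].shape[1]  # number of prompt tokens
reply = tokenizer.decode(outputs[0][prompt_len:], skip_special_tokens=True)
print(reply)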

Examples

**μ‚¬μš©μž:** 인터넷 λΈŒλΌμš°μ €μ— λŒ€ν•΄ μ•Œλ €μ€˜.
**인곡지λŠ₯:** 인터넷 λΈŒλΌμš°μ €λŠ” μΈν„°λ„·μ—μ„œ 정보λ₯Ό κ²€μƒ‰ν•˜κ³ , λ‹€λ₯Έ μ‚¬μš©μžμ™€ μ†Œν†΅ν•˜λŠ” 데 μ‚¬μš©λ˜λŠ” μ†Œν”„νŠΈμ›¨μ–΄μž…λ‹ˆλ‹€.
**μ‚¬μš©μž:** 건강을 μœ μ§€ν•˜κΈ° μœ„ν•œ μ„Έ 가지 νŒμ„ μ•Œλ €μ£Όμ„Έμš”.
**인곡지λŠ₯:** 1. μΆ©λΆ„ν•œ μˆ˜λ©΄μ„ μ·¨ν•˜μ„Έμš”.
2. κ±΄κ°•ν•œ 식단을 μ„­μ·¨ν•˜μ„Έμš”.
3. κ·œμΉ™μ μœΌλ‘œ μš΄λ™ν•˜μ„Έμš”.

Dataset
