
# mobilellm-125m-finetuned-wikitext-originaldtst-roya

This model is a fine-tuned version of facebook/MobileLLM-125M on the Wikitext-2 dataset.

## Model description

MobileLLM-125M fine-tuned on the Wikitext-2 dataset for improved language modeling.

## Training procedure

The model was fine-tuned on the Wikitext-2 dataset using standard supervised fine-tuning.
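The card does not publish the training script or hyperparameters, but standard supervised fine-tuning of a causal LM on Wikitext-2 typically looks like the following sketch using the Hugging Face `Trainer`. The block size, batch size, learning rate, and epoch count below are illustrative assumptions, not the values actually used for this checkpoint.

```python
BLOCK_SIZE = 512  # context length used for chunking (assumption)


def group_texts(token_ids, block_size=BLOCK_SIZE):
    """Concatenate token ids and split them into fixed-length blocks,
    dropping the ragged remainder (the usual causal-LM preprocessing)."""
    total = (len(token_ids) // block_size) * block_size
    return [token_ids[i:i + block_size] for i in range(0, total, block_size)]


def main():
    # Heavy dependencies are imported here so the helper above stays
    # importable without them.
    from datasets import load_dataset
    from transformers import (AutoModelForCausalLM, AutoTokenizer,
                              DataCollatorForLanguageModeling, Trainer,
                              TrainingArguments)

    tokenizer = AutoTokenizer.from_pretrained("facebook/MobileLLM-125M")
    model = AutoModelForCausalLM.from_pretrained("facebook/MobileLLM-125M")

    # Tokenize the raw Wikitext-2 training split and chunk it into blocks.
    raw = load_dataset("wikitext", "wikitext-2-raw-v1", split="train")
    ids = tokenizer("\n\n".join(raw["text"]))["input_ids"]
    train_dataset = [{"input_ids": block} for block in group_texts(ids)]

    args = TrainingArguments(
        output_dir="mobilellm-125m-finetuned-wikitext",
        per_device_train_batch_size=8,   # illustrative
        num_train_epochs=3,              # illustrative
        learning_rate=5e-5,              # illustrative
    )
    trainer = Trainer(
        model=model,
        args=args,
        train_dataset=train_dataset,
        # mlm=False gives the causal (next-token) LM objective.
        data_collator=DataCollatorForLanguageModeling(tokenizer, mlm=False),
    )
    trainer.train()


if __name__ == "__main__":
    main()
```

The `mlm=False` collator shifts labels internally, so no manual label construction is needed for the next-token objective.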

## Usage

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("mia-llm/mobilellm-125m-finetuned-wikitext-originaldtst-roya")
tokenizer = AutoTokenizer.from_pretrained("mia-llm/mobilellm-125m-finetuned-wikitext-originaldtst-roya")

# Generate a short continuation from a prompt
inputs = tokenizer("The history of natural language processing", return_tensors="pt")
print(tokenizer.decode(model.generate(**inputs, max_new_tokens=50)[0], skip_special_tokens=True))
```