Training hyperparameters:

```python
warmup_steps = 5,
num_train_epochs = 3,
learning_rate = 5e-5,
optim = 'adamw_torch',                # alternative: "adamw_8bit"
weight_decay = 0.03,                  # L2 regularization
lr_scheduler_type = "linear",
per_device_train_batch_size = 20,     # max 20 per device
gradient_accumulation_steps = 4,
adafactor = True,
use_liger = True,
```
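These kwargs appear to be Hugging Face `TrainingArguments`-style settings. One consequence worth spelling out is the effective batch size: with gradient accumulation, gradients from several forward/backward passes are summed before each optimizer step, so the batch size the optimizer "sees" is the per-device size times the accumulation steps (times the device count). A minimal sketch, assuming a single GPU (the device count is not stated in the card):

```python
# Effective batch size implied by the settings above.
# Values are taken from the card; num_devices = 1 is an assumption.
per_device_train_batch_size = 20   # from the card ("max 20")
gradient_accumulation_steps = 4    # from the card
num_devices = 1                    # assumption: single-GPU training

effective_batch = (
    per_device_train_batch_size
    * gradient_accumulation_steps
    * num_devices
)
print(effective_batch)  # 80
```

So each optimizer step is taken on 80 examples, even though only 20 fit on the device at once.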
