Loss was 1.668821 and still going down. A second epoch (or more data!) might help.
Uploaded model
- Developed by: paul-stansifer
- License: apache-2.0
- Finetuned from model: unsloth/mistral-7b-bnb-4bit
This Mistral model was trained 2x faster with Unsloth and Hugging Face's TRL library.