---
base_model: unsloth/Qwen2-7B
library_name: peft
license: apache-2.0
tags:
- unsloth
- generated_from_trainer
model-index:
- name: Qwen2-7B_pct_ortho
  results: []
---
# Qwen2-7B_pct_ortho

This model is a fine-tuned version of [unsloth/Qwen2-7B](https://huggingface.co/unsloth/Qwen2-7B) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 2.0729
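Assuming the reported loss is mean token-level cross-entropy in nats, it corresponds to a perplexity of roughly exp(2.0729) ≈ 7.95.

Because `library_name` is `peft`, this repository holds adapter weights rather than a full checkpoint, so they are loaded on top of the base model. Below is a minimal loading sketch; the repository id is a placeholder and `device_map="auto"` assumes `accelerate` is installed.

```python
from peft import PeftModel
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load the base model and tokenizer, then attach the fine-tuned adapter on top.
base_model = AutoModelForCausalLM.from_pretrained(
    "unsloth/Qwen2-7B", torch_dtype="auto", device_map="auto"
)
tokenizer = AutoTokenizer.from_pretrained("unsloth/Qwen2-7B")

# "<user>/Qwen2-7B_pct_ortho" is a placeholder; point it at this adapter repo or a local path.
model = PeftModel.from_pretrained(base_model, "<user>/Qwen2-7B_pct_ortho")

inputs = tokenizer("The quick brown fox", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```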
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0003
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 8
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.02
- num_epochs: 1
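The training script itself is not included in this card. As a rough sketch, the hyperparameters above map onto `transformers.TrainingArguments` as follows; the output directory is a placeholder, the dataset and `Trainer` wiring are omitted, and the Adam settings listed above match the Trainer defaults, so they are not set explicitly.

```python
from transformers import TrainingArguments

# Sketch only: reproduces the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="Qwen2-7B_pct_ortho",  # placeholder output path
    learning_rate=3e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=8,    # 8 per device x 8 accumulation steps = 64 total train batch size
    num_train_epochs=1,
    seed=42,
    lr_scheduler_type="cosine",
    warmup_ratio=0.02,
)
```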
### Training results
Training Loss | Epoch | Step | Validation Loss |
---|---|---|---|
2.0844 | 0.0206 | 8 | 1.9994 |
2.0447 | 0.0412 | 16 | 1.9901 |
2.0874 | 0.0618 | 24 | 2.0098 |
2.0412 | 0.0824 | 32 | 2.0256 |
2.0961 | 0.1031 | 40 | 2.0402 |
2.1029 | 0.1237 | 48 | 2.0545 |
2.1077 | 0.1443 | 56 | 2.0568 |
2.0953 | 0.1649 | 64 | 2.0666 |
2.1231 | 0.1855 | 72 | 2.0795 |
2.187 | 0.2061 | 80 | 2.0806 |
2.1587 | 0.2267 | 88 | 2.0897 |
2.1437 | 0.2473 | 96 | 2.0826 |
2.1689 | 0.2680 | 104 | 2.0951 |
2.0886 | 0.2886 | 112 | 2.1059 |
2.1436 | 0.3092 | 120 | 2.1058 |
2.1525 | 0.3298 | 128 | 2.1013 |
2.1577 | 0.3504 | 136 | 2.1073 |
2.1438 | 0.3710 | 144 | 2.1086 |
2.1596 | 0.3916 | 152 | 2.1189 |
2.1911 | 0.4122 | 160 | 2.1118 |
2.225 | 0.4329 | 168 | 2.1125 |
2.1364 | 0.4535 | 176 | 2.1086 |
2.1469 | 0.4741 | 184 | 2.1079 |
2.19 | 0.4947 | 192 | 2.1081 |
2.1236 | 0.5153 | 200 | 2.1063 |
2.1678 | 0.5359 | 208 | 2.1112 |
2.2087 | 0.5565 | 216 | 2.1102 |
2.1172 | 0.5771 | 224 | 2.0990 |
2.1274 | 0.5977 | 232 | 2.1043 |
2.1229 | 0.6184 | 240 | 2.0957 |
2.1097 | 0.6390 | 248 | 2.0978 |
2.1596 | 0.6596 | 256 | 2.0958 |
2.1294 | 0.6802 | 264 | 2.0876 |
2.1938 | 0.7008 | 272 | 2.0930 |
2.0728 | 0.7214 | 280 | 2.0849 |
2.1827 | 0.7420 | 288 | 2.0802 |
2.1082 | 0.7626 | 296 | 2.0804 |
2.1212 | 0.7833 | 304 | 2.0782 |
2.1074 | 0.8039 | 312 | 2.0761 |
2.1474 | 0.8245 | 320 | 2.0791 |
2.1557 | 0.8451 | 328 | 2.0802 |
2.1122 | 0.8657 | 336 | 2.0803 |
2.1462 | 0.8863 | 344 | 2.0764 |
2.119 | 0.9069 | 352 | 2.0713 |
2.1888 | 0.9275 | 360 | 2.0730 |
2.1138 | 0.9481 | 368 | 2.0733 |
2.1631 | 0.9688 | 376 | 2.0732 |
2.1142 | 0.9894 | 384 | 2.0729 |
### Framework versions
- PEFT 0.12.0
- Transformers 4.44.0
- Pytorch 2.4.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1
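To check a local environment against the versions listed above, a quick sketch (the `+cu121` build suffix on PyTorch is dropped here for simplicity):

```python
import importlib.metadata as metadata

# Compare installed package versions with the ones this adapter was trained under.
expected = {
    "peft": "0.12.0",
    "transformers": "4.44.0",
    "torch": "2.4.0",
    "datasets": "2.20.0",
    "tokenizers": "0.19.1",
}
for package, version in expected.items():
    print(f"{package}: installed {metadata.version(package)}, trained with {version}")
```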