xlm-r-dra-tam-mal-aw-classification-finetune-loreft
This model is a fine-tuned version of livinNector/m-minilm-l12-h384-data-augumented-dra-tam-mal-aw-classification-finetune on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 0.6767
- Accuracy: 0.7706
- F1: 0.8100
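The card does not include usage instructions. The sketch below is one plausible way to load the checkpoint, assuming it is stored as a standard Transformers sequence-classification model; the `loreft` suffix in the name suggests a LoReFT-style fine-tune, so the intervention weights may instead require a dedicated ReFT library (e.g. pyreft) to load.

```python
# Usage sketch (assumption: the checkpoint loads directly with
# AutoModelForSequenceClassification; label names depend on the training setup).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "livinNector/xlm-r-dra-tam-mal-aw-classification-finetune-loreft"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "your input sentence here"  # placeholder input
inputs = tokenizer(text, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(dim=-1).item()
print(model.config.id2label.get(pred, str(pred)))
```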
Model description
More information needed
Intended uses & limitations
More information needed
Training and evaluation data
More information needed
Training procedure
Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 128
- eval_batch_size: 256
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6
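The hyperparameters above correspond roughly to the `TrainingArguments` sketched below; the output directory and the evaluation/logging cadence (every 20 steps, inferred from the results table) are assumptions rather than values stated in this card.

```python
# Approximate TrainingArguments for the settings listed above.
# output_dir, eval_steps and logging_steps are assumptions; the Adam
# betas/epsilon given above match the Transformers defaults.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="xlm-r-dra-tam-mal-aw-classification-finetune-loreft",
    learning_rate=1e-4,
    per_device_train_batch_size=128,
    per_device_eval_batch_size=256,
    seed=42,
    num_train_epochs=6,
    lr_scheduler_type="linear",
    eval_strategy="steps",
    eval_steps=20,
    logging_steps=20,
)
```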
Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy | F1 |
|---|---|---|---|---|---|
| 0.6349 | 0.2222 | 20 | 0.6025 | 0.7685 | 0.8155 |
| 0.4398 | 0.4444 | 40 | 0.5082 | 0.7734 | 0.8173 |
| 0.2523 | 0.6667 | 60 | 0.6233 | 0.7751 | 0.8162 |
| 0.1855 | 0.8889 | 80 | 0.7233 | 0.7722 | 0.8127 |
| 0.2282 | 1.1111 | 100 | 0.6755 | 0.7742 | 0.8147 |
| 0.2111 | 1.3333 | 120 | 0.6446 | 0.7726 | 0.8122 |
| 0.2218 | 1.5556 | 140 | 0.6405 | 0.7730 | 0.8120 |
| 0.2155 | 1.7778 | 160 | 0.6697 | 0.7718 | 0.8116 |
| 0.2181 | 2.0 | 180 | 0.6506 | 0.7718 | 0.8121 |
| 0.2066 | 2.2222 | 200 | 0.6520 | 0.7718 | 0.8116 |
| 0.2164 | 2.4444 | 220 | 0.6621 | 0.7714 | 0.8118 |
| 0.2038 | 2.6667 | 240 | 0.6712 | 0.7710 | 0.8114 |
| 0.21 | 2.8889 | 260 | 0.6707 | 0.7714 | 0.8116 |
| 0.2037 | 3.1111 | 280 | 0.6775 | 0.7714 | 0.8113 |
| 0.2055 | 3.3333 | 300 | 0.6732 | 0.7714 | 0.8112 |
| 0.2072 | 3.5556 | 320 | 0.6748 | 0.7714 | 0.8112 |
| 0.2093 | 3.7778 | 340 | 0.6706 | 0.7718 | 0.8114 |
| 0.2119 | 4.0 | 360 | 0.6641 | 0.7714 | 0.8110 |
| 0.2075 | 4.2222 | 380 | 0.6644 | 0.7714 | 0.8112 |
| 0.2035 | 4.4444 | 400 | 0.6710 | 0.7718 | 0.8116 |
| 0.2128 | 4.6667 | 420 | 0.6712 | 0.7718 | 0.8112 |
| 0.1991 | 4.8889 | 440 | 0.6745 | 0.7714 | 0.8110 |
| 0.2018 | 5.1111 | 460 | 0.6754 | 0.7710 | 0.8104 |
| 0.2002 | 5.3333 | 480 | 0.6818 | 0.7710 | 0.8104 |
| 0.2098 | 5.5556 | 500 | 0.6790 | 0.7706 | 0.8100 |
| 0.2083 | 5.7778 | 520 | 0.6771 | 0.7706 | 0.8100 |
| 0.2202 | 6.0 | 540 | 0.6767 | 0.7706 | 0.8100 |
Framework versions
- Transformers 4.45.2
- Pytorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.20.3
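A quick way to confirm a local environment matches these versions (not part of the original card):

```python
# Print installed versions to compare against the list above.
import datasets
import tokenizers
import torch
import transformers

for pkg in (transformers, torch, datasets, tokenizers):
    print(pkg.__name__, pkg.__version__)
```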