wav2vec2-xls-r-1b-E5-faroese-100h-30-epochs_20250124

This model is a fine-tuned version of davidilag/wav2vec2-xls-r-1b-scandinavian-E5-100h-30-epochs-20250124 on a Faroese speech dataset (roughly 100 hours, per the model name; the exact dataset is not documented). It achieves the following results on the evaluation set (a usage sketch follows the list):

  • Loss: 0.1020
  • WER: 18.7866%
  • CER: 4.0428%
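
As a quick way to try the model, here is a minimal greedy-decoding transcription sketch (not part of the original card). It assumes the repository ships a compatible processor, that input audio is mono 16 kHz (resampled otherwise), and that "sample.wav" is a placeholder file name.

```python
import torch
import torchaudio
from transformers import AutoProcessor, Wav2Vec2ForCTC

MODEL_ID = "davidilag/wav2vec2-xls-r-1b-E5-faroese-100h-30-epochs_20250124"

processor = AutoProcessor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID)
model.eval()

# "sample.wav" is a placeholder; XLS-R expects 16 kHz mono input.
waveform, sr = torchaudio.load("sample.wav")
if sr != 16_000:
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze().numpy(), sampling_rate=16_000, return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: most likely token at each frame, then collapse/decode.
predicted_ids = logits.argmax(dim=-1)
print(processor.batch_decode(predicted_ids)[0])
```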

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch follows the list):

  • learning_rate: 0.0001
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 32
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_steps: 5000
  • num_epochs: 30
  • mixed_precision_training: Native AMP
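
The list above maps directly onto the Hugging Face Trainer configuration. A minimal reconstruction sketch follows; output_dir is a placeholder, and any argument not listed above is left at its default.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="wav2vec2-xls-r-1b-E5-faroese-100h-30-epochs_20250124",  # placeholder
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=2,  # 16 x 2 = total train batch size of 32
    optim="adamw_torch",            # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_steps=5000,
    num_train_epochs=30,
    fp16=True,                      # native AMP mixed-precision training
)
```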

Training results

| Training Loss | Epoch | Step | Validation Loss | WER (%) | CER (%) |
|---------------|-------|------|-----------------|---------|---------|
| 0.6462 | 0.4877 | 1000 | 0.4463 | 49.8480 | 14.3603 |
| 0.4739 | 0.9754 | 2000 | 0.2744 | 35.1720 | 9.4284 |
| 0.3953 | 1.4628 | 3000 | 0.2147 | 31.1671 | 8.0981 |
| 0.3758 | 1.9505 | 4000 | 0.2073 | 31.1847 | 7.9119 |
| 0.3123 | 2.4379 | 5000 | 0.2123 | 29.9423 | 7.7486 |
| 0.3032 | 2.9256 | 6000 | 0.1951 | 29.8277 | 7.5569 |
| 0.2866 | 3.4131 | 7000 | 0.1822 | 28.2372 | 7.1364 |
| 0.2601 | 3.9008 | 8000 | 0.1833 | 27.0432 | 6.8018 |
| 0.2259 | 4.3882 | 9000 | 0.1809 | 26.6996 | 6.7458 |
| 0.2474 | 4.8759 | 10000 | 0.1606 | 26.1312 | 6.4500 |
| 0.2131 | 5.3633 | 11000 | 0.1674 | 26.1929 | 6.5005 |
| 0.214 | 5.8510 | 12000 | 0.1550 | 24.8888 | 6.0910 |
| 0.181 | 6.3385 | 13000 | 0.1583 | 24.7918 | 6.1367 |
| 0.1703 | 6.8261 | 14000 | 0.1457 | 24.9592 | 6.0444 |
| 0.1816 | 7.3136 | 15000 | 0.1578 | 24.6024 | 5.9979 |
| 0.1594 | 7.8013 | 16000 | 0.1482 | 24.3997 | 5.8661 |
| 0.1373 | 8.2887 | 17000 | 0.1485 | 24.0428 | 5.7170 |
| 0.1497 | 8.7764 | 18000 | 0.1383 | 23.8049 | 5.7265 |
| 0.1119 | 9.2638 | 19000 | 0.1379 | 23.0956 | 5.5213 |
| 0.1218 | 9.7515 | 20000 | 0.1504 | 23.6815 | 5.7186 |
| 0.1177 | 10.2390 | 21000 | 0.1395 | 23.4392 | 5.6199 |
| 0.1128 | 10.7267 | 22000 | 0.1383 | 23.3643 | 5.5813 |
| 0.1198 | 11.2141 | 23000 | 0.1360 | 22.7783 | 5.3438 |
| 0.1105 | 11.7018 | 24000 | 0.1375 | 22.5977 | 5.2996 |
| 0.1035 | 12.1892 | 25000 | 0.1252 | 22.4391 | 5.2736 |
| 0.092 | 12.6769 | 26000 | 0.1323 | 22.2629 | 5.2397 |
| 0.0783 | 13.1644 | 27000 | 0.1286 | 22.2717 | 5.1442 |
| 0.0835 | 13.6520 | 28000 | 0.1298 | 21.6284 | 4.9619 |
| 0.0702 | 14.1395 | 29000 | 0.1192 | 21.5447 | 4.9091 |
| 0.0807 | 14.6272 | 30000 | 0.1177 | 21.3773 | 4.9493 |
| 0.0714 | 15.1146 | 31000 | 0.1254 | 21.3112 | 4.8972 |
| 0.0734 | 15.6023 | 32000 | 0.1216 | 21.2980 | 4.8554 |
| 0.0621 | 16.0897 | 33000 | 0.1191 | 20.8618 | 4.7118 |
| 0.0601 | 16.5774 | 34000 | 0.1134 | 20.7913 | 4.6747 |
| 0.0631 | 17.0649 | 35000 | 0.1148 | 20.6327 | 4.6384 |
| 0.0655 | 17.5525 | 36000 | 0.1106 | 20.4697 | 4.5769 |
| 0.0492 | 18.0400 | 37000 | 0.1172 | 20.4520 | 4.5880 |
| 0.0485 | 18.5277 | 38000 | 0.1180 | 20.3066 | 4.6022 |
| 0.0455 | 19.0151 | 39000 | 0.1102 | 20.0511 | 4.4349 |
| 0.0422 | 19.5028 | 40000 | 0.1143 | 20.0511 | 4.4467 |
| 0.0412 | 19.9905 | 41000 | 0.1109 | 19.8749 | 4.3978 |
| 0.0469 | 20.4779 | 42000 | 0.1110 | 20.0203 | 4.4428 |
| 0.0388 | 20.9656 | 43000 | 0.1084 | 19.7163 | 4.3410 |
| 0.0357 | 21.4531 | 44000 | 0.1081 | 19.5356 | 4.3016 |
| 0.043 | 21.9407 | 45000 | 0.1043 | 19.2404 | 4.2211 |
| 0.027 | 22.4282 | 46000 | 0.1074 | 19.2801 | 4.2250 |
| 0.0344 | 22.9159 | 47000 | 0.1091 | 19.3374 | 4.2124 |
| 0.0306 | 23.4033 | 48000 | 0.1083 | 19.2096 | 4.1982 |
| 0.033 | 23.8910 | 49000 | 0.1037 | 19.1259 | 4.1611 |
| 0.0309 | 24.3784 | 50000 | 0.1071 | 19.1743 | 4.1840 |
| 0.0246 | 24.8661 | 51000 | 0.0986 | 19.1127 | 4.1438 |
| 0.0299 | 25.3536 | 52000 | 0.1045 | 18.9673 | 4.1098 |
| 0.0296 | 25.8413 | 53000 | 0.1013 | 18.9717 | 4.0901 |
| 0.0272 | 26.3287 | 54000 | 0.1023 | 18.7822 | 4.0404 |
| 0.0225 | 26.8164 | 55000 | 0.1032 | 18.7690 | 4.0380 |
| 0.0206 | 27.3038 | 56000 | 0.1020 | 18.7734 | 4.0436 |
| 0.0273 | 27.7915 | 57000 | 0.1020 | 18.8131 | 4.0483 |
| 0.0267 | 28.2790 | 58000 | 0.1015 | 18.8131 | 4.0499 |
| 0.0268 | 28.7666 | 59000 | 0.1020 | 18.7866 | 4.0428 |
| 0.0307 | 29.2541 | 60000 | 0.1020 | 18.7822 | 4.0436 |
| 0.033 | 29.7418 | 61000 | 0.1020 | 18.7866 | 4.0428 |
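
WER and CER in the table above are percentages. As a reference for how such numbers are typically computed, here is a minimal sketch using the Hugging Face evaluate library; the reference/prediction strings are illustrative placeholders, not samples from the actual evaluation set.

```python
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Placeholder strings; the card's numbers come from the held-out Faroese set.
references = ["hetta er ein roynd"]
predictions = ["hetta er ein roynd"]

# evaluate returns fractions; multiply by 100 to match the table's units.
wer = 100 * wer_metric.compute(references=references, predictions=predictions)
cer = 100 * cer_metric.compute(references=references, predictions=predictions)
print(f"WER (%): {wer:.4f}  CER (%): {cer:.4f}")
```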

Framework versions

  • Transformers 4.48.1
  • PyTorch 2.5.1+cu124
  • Datasets 3.2.0
  • Tokenizers 0.21.0

Model size

  • 963M parameters (safetensors, F32)
