whisper-small-Fleurs_AMMI_AFRIVOICE_LRSC-ln-109hrs-v1

This model is a fine-tuned version of openai/whisper-small. It achieves the following results on the evaluation set:

  • Loss: 0.9074
  • Wer: 0.2063
  • Cer: 0.0793
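WER (word error rate) and CER (character error rate) are both edit-distance metrics: the Levenshtein distance between hypothesis and reference at the word or character level, divided by the reference length. A minimal self-contained sketch (not the exact evaluation code used for this model):

```python
def edit_distance(ref, hyp):
    # Classic dynamic-programming Levenshtein distance over token sequences,
    # keeping only one row of the DP table in memory.
    d = list(range(len(hyp) + 1))
    for i in range(1, len(ref) + 1):
        prev = d[0]
        d[0] = i
        for j in range(1, len(hyp) + 1):
            cur = d[j]
            d[j] = min(
                d[j] + 1,                          # deletion
                d[j - 1] + 1,                      # insertion
                prev + (ref[i - 1] != hyp[j - 1])  # substitution (or match)
            )
            prev = cur
    return d[-1]

def wer(reference, hypothesis):
    # Word-level edit distance normalized by reference word count.
    ref_words = reference.split()
    return edit_distance(ref_words, hypothesis.split()) / len(ref_words)

def cer(reference, hypothesis):
    # Character-level edit distance normalized by reference length.
    return edit_distance(list(reference), list(hypothesis)) / len(reference)

print(wer("the cat sat", "the cat sat down"))  # one insertion over 3 words
```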

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: AdamW (adamw_hf) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
  • mixed_precision_training: Native AMP
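The total_train_batch_size listed above is derived, not set directly: it is the per-device train batch size multiplied by the gradient-accumulation steps. A short sketch of that relationship, with the values copied from the list above:

```python
# Hyperparameter values copied from the model card.
hparams = {
    "learning_rate": 1e-05,
    "train_batch_size": 4,
    "eval_batch_size": 2,
    "seed": 42,
    "gradient_accumulation_steps": 2,
    "lr_scheduler_type": "linear",
    "lr_scheduler_warmup_ratio": 0.1,
    "num_epochs": 100,
}

# Effective (total) train batch size: gradients are accumulated over
# 2 micro-batches of 4 before each optimizer step.
total_train_batch_size = (
    hparams["train_batch_size"] * hparams["gradient_accumulation_steps"]
)
print(total_train_batch_size)  # 8, matching the value listed above
```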

Training results

| Training Loss | Epoch | Step   | Validation Loss | Wer    | Cer    |
|:-------------:|:-----:|:------:|:---------------:|:------:|:------:|
| 1.4677        | 1.0   | 2880   | 0.7253          | 0.6948 | 0.3565 |
| 0.6293        | 2.0   | 5760   | 0.5281          | 0.5550 | 0.3003 |
| 0.4895        | 3.0   | 8640   | 0.4502          | 0.4045 | 0.2119 |
| 0.3993        | 4.0   | 11520  | 0.4109          | 0.4834 | 0.2918 |
| 0.3241        | 5.0   | 14400  | 0.3989          | 0.4870 | 0.2655 |
| 0.2551        | 6.0   | 17280  | 0.4082          | 0.4294 | 0.2432 |
| 0.1918        | 7.0   | 20160  | 0.4232          | 0.3710 | 0.1913 |
| 0.1369        | 8.0   | 23040  | 0.4381          | 0.4223 | 0.2191 |
| 0.0967        | 9.0   | 25920  | 0.4774          | 0.3570 | 0.1636 |
| 0.0709        | 10.0  | 28800  | 0.4869          | 0.3268 | 0.1283 |
| 0.0544        | 11.0  | 31680  | 0.5251          | 0.2836 | 0.1034 |
| 0.04          | 12.0  | 34560  | 0.5354          | 0.2856 | 0.1043 |
| 0.0309        | 13.0  | 37440  | 0.5464          | 0.2683 | 0.0985 |
| 0.0247        | 14.0  | 40320  | 0.5621          | 0.2640 | 0.0933 |
| 0.021         | 15.0  | 43200  | 0.5743          | 0.2510 | 0.0862 |
| 0.0185        | 16.0  | 46080  | 0.5972          | 0.2540 | 0.0893 |
| 0.0161        | 17.0  | 48960  | 0.5969          | 0.2399 | 0.0820 |
| 0.0143        | 18.0  | 51840  | 0.6073          | 0.2394 | 0.0805 |
| 0.013         | 19.0  | 54720  | 0.6256          | 0.2350 | 0.0814 |
| 0.0117        | 20.0  | 57600  | 0.6198          | 0.2298 | 0.0800 |
| 0.0103        | 21.0  | 60480  | 0.6410          | 0.2330 | 0.0830 |
| 0.0097        | 22.0  | 63360  | 0.6626          | 0.2373 | 0.0821 |
| 0.009         | 23.0  | 66240  | 0.6666          | 0.2279 | 0.0770 |
| 0.0083        | 24.0  | 69120  | 0.6686          | 0.2269 | 0.0779 |
| 0.0075        | 25.0  | 72000  | 0.6851          | 0.2220 | 0.0793 |
| 0.0071        | 26.0  | 74880  | 0.6944          | 0.2261 | 0.0784 |
| 0.0065        | 27.0  | 77760  | 0.7010          | 0.2240 | 0.0793 |
| 0.0058        | 28.0  | 80640  | 0.6999          | 0.2282 | 0.0812 |
| 0.0055        | 29.0  | 83520  | 0.7135          | 0.2208 | 0.0778 |
| 0.0054        | 30.0  | 86400  | 0.7268          | 0.2240 | 0.0806 |
| 0.0047        | 31.0  | 89280  | 0.7210          | 0.2208 | 0.0774 |
| 0.0045        | 32.0  | 92160  | 0.7255          | 0.2180 | 0.0776 |
| 0.0045        | 33.0  | 95040  | 0.7512          | 0.2179 | 0.0776 |
| 0.0037        | 34.0  | 97920  | 0.7662          | 0.2177 | 0.0812 |
| 0.0038        | 35.0  | 100800 | 0.7562          | 0.2087 | 0.0757 |
| 0.0036        | 36.0  | 103680 | 0.7524          | 0.2093 | 0.0740 |
| 0.0033        | 37.0  | 106560 | 0.7719          | 0.2136 | 0.0783 |
| 0.003         | 38.0  | 109440 | 0.7792          | 0.2149 | 0.0764 |
| 0.0029        | 39.0  | 112320 | 0.7883          | 0.2135 | 0.0761 |
| 0.0029        | 40.0  | 115200 | 0.7960          | 0.2098 | 0.0750 |
| 0.0026        | 41.0  | 118080 | 0.7779          | 0.2072 | 0.0750 |
| 0.0026        | 42.0  | 120960 | 0.7881          | 0.2155 | 0.0784 |
| 0.0024        | 43.0  | 123840 | 0.7936          | 0.2076 | 0.0740 |
| 0.0022        | 44.0  | 126720 | 0.8039          | 0.2071 | 0.0735 |
| 0.0022        | 45.0  | 129600 | 0.8125          | 0.2055 | 0.0740 |
| 0.0021        | 46.0  | 132480 | 0.8248          | 0.2118 | 0.0749 |
| 0.0019        | 47.0  | 135360 | 0.8211          | 0.2119 | 0.0777 |
| 0.0017        | 48.0  | 138240 | 0.8212          | 0.2100 | 0.0783 |
| 0.0019        | 49.0  | 141120 | 0.8303          | 0.2052 | 0.0744 |
| 0.0018        | 50.0  | 144000 | 0.8337          | 0.2059 | 0.0761 |
| 0.0016        | 51.0  | 146880 | 0.8529          | 0.2100 | 0.0776 |
| 0.0013        | 52.0  | 149760 | 0.8497          | 0.2109 | 0.0788 |
| 0.0014        | 53.0  | 152640 | 0.8656          | 0.2115 | 0.0777 |
| 0.0013        | 54.0  | 155520 | 0.8556          | 0.2068 | 0.0784 |
| 0.0012        | 55.0  | 158400 | 0.8539          | 0.2058 | 0.0777 |
| 0.0012        | 56.0  | 161280 | 0.8640          | 0.2006 | 0.0730 |
| 0.0011        | 57.0  | 164160 | 0.8455          | 0.2037 | 0.0770 |
| 0.0011        | 58.0  | 167040 | 0.8684          | 0.2068 | 0.0778 |
| 0.001         | 59.0  | 169920 | 0.8569          | 0.2057 | 0.0771 |
| 0.0009        | 60.0  | 172800 | 0.8669          | 0.2023 | 0.0741 |
| 0.001         | 61.0  | 175680 | 0.8780          | 0.2069 | 0.0785 |
| 0.0008        | 62.0  | 178560 | 0.8928          | 0.2068 | 0.0763 |
| 0.0007        | 63.0  | 181440 | 0.8991          | 0.2052 | 0.0750 |
| 0.0009        | 64.0  | 184320 | 0.9090          | 0.2093 | 0.0799 |
| 0.0007        | 65.0  | 187200 | 0.8953          | 0.2011 | 0.0754 |
| 0.0006        | 66.0  | 190080 | 0.9095          | 0.2059 | 0.0772 |
| 0.0006        | 67.0  | 192960 | 0.9042          | 0.2004 | 0.0766 |
| 0.0005        | 68.0  | 195840 | 0.9074          | 0.2063 | 0.0793 |
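The metrics reported at the top of this card correspond to the final checkpoint (epoch 68, WER 0.2063), but validation WER bottoms out slightly earlier. A minimal sketch of selecting the best epoch by validation WER, using a few (epoch, WER) pairs copied from the tail of the table above:

```python
# (epoch, validation WER) pairs copied from the last rows of the table.
rows = [(65, 0.2011), (66, 0.2059), (67, 0.2004), (68, 0.2063)]

# Pick the epoch with the lowest validation WER rather than the last one.
best_epoch, best_wer = min(rows, key=lambda r: r[1])
print(best_epoch, best_wer)  # 67 0.2004
```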

Framework versions

  • Transformers 4.46.3
  • Pytorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3
Model size: 242M parameters (Safetensors, F32)
