This repository is publicly accessible, but you must accept its access conditions before downloading the files.

whisper-small-Fleurs_AMMI_AFRIVOICE_LRSC-ln-5hrs-v1

This model is a fine-tuned version of openai/whisper-small on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.9246
  • WER: 0.2946
  • CER: 0.1160
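Once access has been granted, the checkpoint can be used through the standard transformers ASR pipeline. A minimal sketch (the audio file name is hypothetical, and because the repository is gated you must first authenticate, e.g. with `huggingface-cli login`):

```python
MODEL_ID = "asr-africa/whisper-small-Fleurs_AMMI_AFRIVOICE_LRSC-ln-5hrs-v1"

def transcribe(audio_path: str) -> str:
    """Transcribe an audio file with the fine-tuned Whisper checkpoint."""
    # Imported lazily so this module can be inspected without transformers installed.
    from transformers import pipeline

    asr = pipeline(
        "automatic-speech-recognition",
        model=MODEL_ID,
        chunk_length_s=30,  # Whisper processes audio in 30-second windows
    )
    return asr(audio_path)["text"]

# Example call with a hypothetical 16 kHz recording:
# print(transcribe("sample_lingala.wav"))
```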

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 2
  • seed: 42
  • gradient_accumulation_steps: 2
  • total_train_batch_size: 8
  • optimizer: adamw_hf with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 100
  • mixed_precision_training: Native AMP
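In the Hugging Face Trainer API, the hyperparameters above map roughly onto a Seq2SeqTrainingArguments configuration like the following sketch. The output directory and the evaluation/save cadence are illustrative assumptions, not stated in the card:

```python
from transformers import Seq2SeqTrainingArguments

# Sketch of the listed hyperparameters; output_dir and eval_strategy
# are assumptions for illustration only.
training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-small-ln-5hrs",  # hypothetical path
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=2,   # effective train batch size = 4 * 2 = 8
    seed=42,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    optim="adamw_hf",                # AdamW with betas=(0.9, 0.999), eps=1e-8
    fp16=True,                       # "Native AMP" mixed precision
    eval_strategy="epoch",           # assumed: the results table evaluates per epoch
    predict_with_generate=True,      # generate transcripts so WER/CER can be computed
)
```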

Training results

Training Loss  Epoch  Step  Validation Loss  WER  CER
3.2458 1.0 197 1.3723 1.0300 0.5178
1.1335 2.0 394 0.9104 0.8444 0.4119
0.7565 3.0 591 0.7383 0.7341 0.3627
0.5358 4.0 788 0.6557 0.7250 0.3796
0.3661 5.0 985 0.6261 0.8424 0.4895
0.2372 6.0 1182 0.6162 0.7220 0.4038
0.1402 7.0 1379 0.6452 0.8728 0.5281
0.0823 8.0 1576 0.6453 0.9265 0.5762
0.0506 9.0 1773 0.6683 0.7950 0.4716
0.038 10.0 1970 0.6756 0.8420 0.5188
0.0288 11.0 2167 0.7030 0.6811 0.3833
0.0218 12.0 2364 0.7101 0.4698 0.2265
0.0156 13.0 2561 0.7235 0.6326 0.3551
0.0119 14.0 2758 0.7426 0.4402 0.2030
0.0104 15.0 2955 0.7523 0.5167 0.2593
0.0073 16.0 3152 0.7586 0.4773 0.2402
0.0062 17.0 3349 0.7481 0.4544 0.2327
0.004 18.0 3546 0.7596 0.4101 0.1972
0.0022 19.0 3743 0.7582 0.3821 0.1676
0.0023 20.0 3940 0.7786 0.4205 0.2031
0.0021 21.0 4137 0.7690 0.4023 0.1838
0.0021 22.0 4334 0.7817 0.4109 0.1830
0.0016 23.0 4531 0.7935 0.4091 0.1793
0.0021 24.0 4728 0.7997 0.4872 0.2415
0.0045 25.0 4925 0.7827 0.3801 0.1615
0.0065 26.0 5122 0.8009 0.3772 0.1667
0.0056 27.0 5319 0.7998 0.3757 0.1680
0.0053 28.0 5516 0.8201 0.3383 0.1376
0.0045 29.0 5713 0.8074 0.4136 0.1972
0.0033 30.0 5910 0.8100 0.3300 0.1318
0.0012 31.0 6107 0.8238 0.3600 0.1674
0.0013 32.0 6304 0.8251 0.3342 0.1314
0.0008 33.0 6501 0.8447 0.3166 0.1208
0.0015 34.0 6698 0.8350 0.3079 0.1159
0.001 35.0 6895 0.8499 0.3329 0.1364
0.0021 36.0 7092 0.8348 0.3158 0.1225
0.0012 37.0 7289 0.8511 0.2980 0.1062
0.0014 38.0 7486 0.8434 0.3392 0.1386
0.0018 39.0 7683 0.8632 0.3224 0.1284
0.0021 40.0 7880 0.8481 0.3721 0.1738
0.0021 41.0 8077 0.8448 0.3692 0.1645
0.0033 42.0 8274 0.8678 0.3424 0.1474
0.0011 43.0 8471 0.8617 0.3218 0.1269
0.0012 44.0 8668 0.8570 0.2992 0.1180
0.0007 45.0 8865 0.8609 0.2970 0.1115
0.0008 46.0 9062 0.8931 0.3132 0.1185
0.001 47.0 9259 0.8838 0.3127 0.1291
0.0005 48.0 9456 0.8732 0.2906 0.1145
0.0026 49.0 9653 0.8671 0.2949 0.1120
0.0014 50.0 9850 0.8750 0.3150 0.1308
0.0009 51.0 10047 0.8786 0.3084 0.1292
0.0007 52.0 10244 0.8850 0.3007 0.1132
0.0005 53.0 10441 0.8919 0.2918 0.1058
0.0003 54.0 10638 0.8893 0.2980 0.1156
0.0002 55.0 10835 0.9022 0.2953 0.1155
0.0 56.0 11032 0.9002 0.2849 0.1020
0.0005 57.0 11229 0.9138 0.2853 0.1032
0.0007 58.0 11426 0.8995 0.2957 0.1158
0.0004 59.0 11623 0.8997 0.2854 0.1061
0.0013 60.0 11820 0.9066 0.2837 0.0993
0.002 61.0 12017 0.9002 0.3039 0.1150
0.001 62.0 12214 0.9220 0.2925 0.1088
0.0014 63.0 12411 0.9129 0.3056 0.1212
0.0009 64.0 12608 0.9137 0.2844 0.1051
0.0008 65.0 12805 0.9128 0.3075 0.1307
0.0005 66.0 13002 0.9248 0.2870 0.1040
0.0004 67.0 13199 0.9281 0.2855 0.1011
0.0007 68.0 13396 0.9097 0.2883 0.1079
0.0005 69.0 13593 0.9092 0.2981 0.1158
0.0004 70.0 13790 0.9170 0.2977 0.1133
0.0002 71.0 13987 0.9252 0.3131 0.1272
0.001 72.0 14184 0.9246 0.2946 0.1160
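The WER and CER columns above follow the standard edit-distance definitions: Levenshtein distance between reference and hypothesis, normalized by reference length, over words and characters respectively. A minimal pure-Python sketch (the Lingala-like sample sentences are invented for illustration and are not from the training data):

```python
def edit_distance(ref, hyp):
    # Classic dynamic-programming Levenshtein distance over token sequences.
    prev = list(range(len(hyp) + 1))
    for i, r in enumerate(ref, 1):
        cur = [i]
        for j, h in enumerate(hyp, 1):
            cur.append(min(prev[j] + 1,               # deletion
                           cur[j - 1] + 1,            # insertion
                           prev[j - 1] + (r != h)))   # substitution
        prev = cur
    return prev[-1]

def wer(ref, hyp):
    """Word error rate: word-level edit distance / reference word count."""
    ref_words = ref.split()
    return edit_distance(ref_words, hyp.split()) / len(ref_words)

def cer(ref, hyp):
    """Character error rate: char-level edit distance / reference length."""
    return edit_distance(list(ref), list(hyp)) / len(ref)

if __name__ == "__main__":
    reference  = "na tonga ndako ya sika"   # invented sample sentence
    hypothesis = "na tonga ndaku ya sika"   # one substituted word
    print(f"WER = {wer(reference, hypothesis):.4f}")  # 1 error / 5 words = 0.2000
    print(f"CER = {cer(reference, hypothesis):.4f}")  # 1 error / 22 chars ≈ 0.0455
```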

Framework versions

  • Transformers 4.46.3
  • Pytorch 2.1.0+cu118
  • Datasets 3.1.0
  • Tokenizers 0.20.3
Model size: 242M parameters (F32, Safetensors)
