---
library_name: transformers
language:
- hu
license: apache-2.0
base_model: openai/whisper-small
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper-small-finetuned-hu
  results: []
---

# whisper-small-finetuned-hu

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on a custom Hungarian speech dataset.
It achieves the following results on the evaluation set:
- Loss: 0.02658
- WER: 0.08494

## Model description

More information needed

## Intended uses & limitations

More information needed. A minimal inference sketch is provided at the end of this card.

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 2.5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- distributed_type: multi-GPU
- num_devices: 2
- total_train_batch_size: 64
- total_eval_batch_size: 64
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 2
- mixed_precision_training: Native AMP

A sketch of how these settings map onto `Seq2SeqTrainingArguments` follows the framework versions below.

### Training results

| Training Loss | Epoch  | Step  | Validation Loss | WER    |
|:-------------:|:------:|:-----:|:---------------:|:------:|
| 0.0559        | 0.0902 | 2000  | 0.0575          | 0.2634 |
| 0.0481        | 0.1804 | 4000  | 0.0488          | 0.1917 |
| 0.0415        | 0.2707 | 6000  | 0.0438          | 0.1329 |
| 0.0408        | 0.3609 | 8000  | 0.0408          | 0.1234 |
| 0.0393        | 0.4511 | 10000 | 0.0388          | 0.1173 |
| 0.0375        | 0.5413 | 12000 | 0.0372          | 0.1119 |
| 0.0342        | 0.6316 | 14000 | 0.0357          | 0.1101 |
| 0.0335        | 0.7218 | 16000 | 0.0349          | 0.1071 |
| 0.0323        | 0.8120 | 18000 | 0.0331          | 0.1037 |
| 0.0325        | 0.9022 | 20000 | 0.0326          | 0.1035 |
| 0.0305        | 0.9925 | 22000 | 0.0315          | 0.0974 |
| 0.02          | 1.0827 | 24000 | 0.0312          | 0.0992 |
| 0.0207        | 1.1729 | 26000 | 0.0310          | 0.0937 |
| 0.0203        | 1.2631 | 28000 | 0.0301          | 0.0941 |
| 0.0215        | 1.3534 | 30000 | 0.0296          | 0.0913 |
| 0.0199        | 1.4436 | 32000 | 0.0289          | 0.0911 |
| 0.0197        | 1.5338 | 34000 | 0.0285          | 0.0890 |
| 0.0187        | 1.6240 | 36000 | 0.0279          | 0.0887 |
| 0.0188        | 1.7143 | 38000 | 0.0276          | 0.0882 |
| 0.0186        | 1.8045 | 40000 | 0.0271          | 0.0856 |
| 0.0181        | 1.8947 | 42000 | 0.0266          | 0.0849 |
| 0.0176        | 1.9849 | 44000 | 0.0264          | 0.0863 |

### Framework versions

- Transformers 4.47.0
- PyTorch 2.5.1+cu118
- Datasets 3.1.0
- Tokenizers 0.21.0
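
### Reproducing the training configuration

For reference, the hyperparameters listed above translate roughly into the following `Seq2SeqTrainingArguments`. This is a minimal sketch, not the exact training script: the output directory and evaluation cadence are assumptions (the eval interval of 2000 steps is inferred from the results table), and the data pipeline, collator, and `Seq2SeqTrainer` setup are omitted. The per-device batch size of 32 across 2 GPUs accounts for the total train batch size of 64.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch mirroring the hyperparameters reported in this card.
# output_dir and the eval cadence are assumptions, not confirmed values.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-finetuned-hu",  # hypothetical path
    learning_rate=2.5e-5,
    per_device_train_batch_size=32,  # x2 GPUs = total train batch size 64
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=2,
    fp16=True,                  # "Native AMP" mixed precision
    eval_strategy="steps",      # cadence inferred from the results table
    eval_steps=2000,
    predict_with_generate=True,  # needed to compute WER during evaluation
)
```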
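
## How to use

A minimal inference sketch using the `transformers` ASR pipeline. The hub id and audio filename below are placeholders, not confirmed paths; substitute the repository where this checkpoint is actually hosted and a real audio file.

```python
import torch
from transformers import pipeline

use_cuda = torch.cuda.is_available()

# Load the fine-tuned checkpoint into an automatic-speech-recognition pipeline.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-small-finetuned-hu",  # hypothetical hub id
    torch_dtype=torch.float16 if use_cuda else torch.float32,
    device="cuda:0" if use_cuda else "cpu",
)

# Transcribe a local Hungarian audio file (any format ffmpeg can decode).
result = asr(
    "sample_hu.wav",  # placeholder filename
    generate_kwargs={"language": "hungarian", "task": "transcribe"},
)
print(result["text"])
```

Forcing `language` and `task` in `generate_kwargs` disables Whisper's language auto-detection, which is usually what you want for a single-language fine-tune.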