---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-small
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: whisper-a-nomi-18
  results: []
---

# whisper-a-nomi-18

This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0346
- WER: 14.4772

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a hedged code sketch mapping them to `Seq2SeqTrainingArguments` appears after the framework versions below):
- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 18
- mixed_precision_training: Native AMP

### Training results

| Training Loss | Epoch | Step | Validation Loss | WER     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
| No log        | 1.0   | 88   | 0.0813          | 11.4388 |
| 0.944         | 2.0   | 176  | 0.0636          | 11.2601 |
| 0.1726        | 3.0   | 264  | 0.0395          | 16.1752 |
| 0.0661        | 4.0   | 352  | 0.0895          | 25.7373 |
| 0.145         | 5.0   | 440  | 0.0627          | 19.9285 |
| 0.0218        | 6.0   | 528  | 0.0481          | 8.3110  |
| 0.0187        | 7.0   | 616  | 0.0782          | 23.0563 |
| 0.0282        | 8.0   | 704  | 0.0435          | 16.6220 |
| 0.0282        | 9.0   | 792  | 0.0284          | 11.7069 |
| 0.0055        | 10.0  | 880  | 0.0338          | 17.0688 |
| 0.0027        | 11.0  | 968  | 0.0463          | 17.3369 |
| 0.0039        | 12.0  | 1056 | 0.0362          | 11.6175 |
| 0.0038        | 13.0  | 1144 | 0.0353          | 14.6559 |
| 0.0014        | 14.0  | 1232 | 0.0347          | 14.5666 |
| 0.0           | 15.0  | 1320 | 0.0346          | 14.4772 |
| 0.0           | 16.0  | 1408 | 0.0346          | 14.4772 |
| 0.0           | 17.0  | 1496 | 0.0346          | 14.4772 |
| 0.0           | 18.0  | 1584 | 0.0346          | 14.4772 |

### Framework versions

- Transformers 4.48.0.dev0
- PyTorch 2.4.0
- Datasets 3.1.0
- Tokenizers 0.21.0
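
### Training configuration sketch

As a minimal sketch, the hyperparameters listed above map onto `transformers.Seq2SeqTrainingArguments` roughly as follows. This is a reconstruction, not the original training script: the dataset, model, and `Seq2SeqTrainer` wiring are omitted, and `output_dir` is illustrative.

```python
from transformers import Seq2SeqTrainingArguments

# Sketch only: mirrors the hyperparameters listed in the card above.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-a-nomi-18",   # illustrative
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,    # effective train batch size 8 * 2 = 16 on one device
    optim="adamw_torch",              # AdamW; betas=(0.9, 0.999) and eps=1e-8 are the defaults
    lr_scheduler_type="linear",
    warmup_steps=132,
    num_train_epochs=18,
    seed=42,
    fp16=True,                        # native AMP mixed precision
)
```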
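
## Inference example

A minimal inference sketch using the standard `transformers` ASR pipeline. The repo id below is a placeholder (the card does not state where the checkpoint is published); substitute the actual Hub id or a local path to the fine-tuned checkpoint.

```python
from transformers import pipeline

# Placeholder model id -- replace with the actual Hub repo id or a local path.
asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-a-nomi-18",
)

# Transcribe an audio file (path is illustrative; the pipeline decodes and
# resamples the file to the 16 kHz rate Whisper expects).
result = asr("sample.wav")
print(result["text"])
```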