---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: hbenitez/AV_classifier_resnet50
  results: []
---

# hbenitez/AV_classifier_resnet50

This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.8261
- Validation Loss: 2.4425
- Train Accuracy: 0.6
- Epoch: 99

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 8000, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
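The optimizer dictionary above corresponds to the `AdamWeightDecay` optimizer and `PolynomialDecay` schedule from 🤗 Transformers' TensorFlow utilities. Purely as an illustration (this is not the original training script), an equivalent optimizer could be reconstructed with `create_optimizer`, mapping the listed values onto its parameters:

```python
# Hedged sketch: rebuild an optimizer matching the config above.
# Not the original training code; values are taken from the hyperparameter list.
from transformers import create_optimizer

optimizer, lr_schedule = create_optimizer(
    init_lr=3e-5,            # initial_learning_rate of the PolynomialDecay schedule
    num_train_steps=8000,    # decay_steps (decays linearly to end_learning_rate=0.0)
    num_warmup_steps=0,      # no warmup appears in the listed config
    weight_decay_rate=0.01,  # AdamWeightDecay weight_decay_rate
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    power=1.0,
)
```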
### Training results

| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 8.2326 | 8.1060 | 0.0 | 0 |
| 8.3420 | 7.6394 | 0.05 | 1 |
| 7.9643 | 7.5706 | 0.05 | 2 |
| 7.9337 | 7.6265 | 0.05 | 3 |
| 7.8018 | 7.7736 | 0.05 | 4 |
| 7.8009 | 7.7905 | 0.05 | 5 |
| 7.6369 | 7.6354 | 0.05 | 6 |
| 7.4782 | 7.5608 | 0.05 | 7 |
| 7.3655 | 7.6271 | 0.05 | 8 |
| 7.2886 | 7.6028 | 0.0 | 9 |
| 7.1145 | 7.5211 | 0.0 | 10 |
| 7.1232 | 7.2993 | 0.0 | 11 |
| 6.8393 | 7.2079 | 0.0 | 12 |
| 6.8202 | 7.2143 | 0.0 | 13 |
| 6.7180 | 7.1236 | 0.05 | 14 |
| 6.7318 | 7.1061 | 0.0 | 15 |
| 6.4563 | 6.9758 | 0.05 | 16 |
| 6.3765 | 6.9413 | 0.05 | 17 |
| 6.1791 | 6.8315 | 0.05 | 18 |
| 6.1946 | 6.7703 | 0.05 | 19 |
| 5.8448 | 6.7431 | 0.1 | 20 |
| 5.8514 | 6.6876 | 0.1 | 21 |
| 5.8200 | 6.6353 | 0.05 | 22 |
| 5.8323 | 6.5814 | 0.05 | 23 |
| 5.5553 | 6.4306 | 0.05 | 24 |
| 5.4999 | 6.4455 | 0.05 | 25 |
| 5.4370 | 6.3026 | 0.05 | 26 |
| 5.2288 | 6.0093 | 0.1 | 27 |
| 5.2173 | 6.0593 | 0.05 | 28 |
| 5.2280 | 6.0598 | 0.05 | 29 |
| 5.0484 | 5.9769 | 0.05 | 30 |
| 4.8703 | 5.8336 | 0.05 | 31 |
| 4.9881 | 5.7711 | 0.1 | 32 |
| 4.5905 | 5.6685 | 0.1 | 33 |
| 4.7240 | 5.6156 | 0.15 | 34 |
| 4.5095 | 5.4680 | 0.15 | 35 |
| 4.2225 | 5.3962 | 0.15 | 36 |
| 4.3615 | 5.3290 | 0.2 | 37 |
| 4.1862 | 5.3602 | 0.15 | 38 |
| 3.9455 | 5.2635 | 0.15 | 39 |
| 3.9737 | 5.2337 | 0.15 | 40 |
| 4.0922 | 5.1268 | 0.15 | 41 |
| 3.6042 | 4.9972 | 0.2 | 42 |
| 3.7219 | 4.8787 | 0.15 | 43 |
| 3.5563 | 4.9075 | 0.2 | 44 |
| 3.5897 | 4.9157 | 0.25 | 45 |
| 3.5769 | 4.7936 | 0.25 | 46 |
| 3.6225 | 4.8689 | 0.2 | 47 |
| 3.4568 | 4.8767 | 0.2 | 48 |
| 3.1431 | 4.7520 | 0.25 | 49 |
| 3.0607 | 4.5815 | 0.3 | 50 |
| 2.8904 | 4.5007 | 0.2 | 51 |
| 2.8308 | 4.5054 | 0.25 | 52 |
| 2.8136 | 4.2745 | 0.25 | 53 |
| 2.6192 | 4.3300 | 0.2 | 54 |
| 2.5308 | 4.3180 | 0.2 | 55 |
| 2.5192 | 4.2706 | 0.2 | 56 |
| 2.5761 | 4.1395 | 0.25 | 57 |
| 2.3516 | 3.9031 | 0.3 | 58 |
| 2.3231 | 3.8172 | 0.35 | 59 |
| 2.2735 | 3.7651 | 0.35 | 60 |
| 2.1215 | 3.8034 | 0.35 | 61 |
| 2.3229 | 3.8096 | 0.35 | 62 |
| 2.2230 | 3.7000 | 0.35 | 63 |
| 1.9059 | 3.6666 | 0.25 | 64 |
| 2.0289 | 3.6743 | 0.25 | 65 |
| 1.9178 | 3.5819 | 0.3 | 66 |
| 2.0295 | 3.5087 | 0.35 | 67 |
| 1.6499 | 3.4962 | 0.4 | 68 |
| 1.6261 | 3.4146 | 0.3 | 69 |
| 1.7059 | 3.4097 | 0.35 | 70 |
| 1.4837 | 3.2702 | 0.35 | 71 |
| 1.3766 | 3.2214 | 0.4 | 72 |
| 1.5898 | 3.2674 | 0.4 | 73 |
| 1.5002 | 3.1907 | 0.4 | 74 |
| 1.2641 | 3.1176 | 0.4 | 75 |
| 1.3456 | 3.1562 | 0.4 | 76 |
| 1.2655 | 2.9548 | 0.5 | 77 |
| 1.5449 | 2.8738 | 0.5 | 78 |
| 1.2519 | 2.8336 | 0.45 | 79 |
| 1.0682 | 2.8478 | 0.35 | 80 |
| 1.1891 | 2.8408 | 0.5 | 81 |
| 1.2920 | 2.6254 | 0.5 | 82 |
| 1.1239 | 2.7507 | 0.5 | 83 |
| 1.0857 | 2.7772 | 0.4 | 84 |
| 0.9821 | 2.8372 | 0.45 | 85 |
| 1.0457 | 2.8636 | 0.45 | 86 |
| 1.1419 | 2.8426 | 0.45 | 87 |
| 1.0782 | 2.7856 | 0.5 | 88 |
| 0.9906 | 2.6826 | 0.55 | 89 |
| 1.0766 | 2.6707 | 0.5 | 90 |
| 1.1115 | 2.6457 | 0.5 | 91 |
| 1.2201 | 2.6838 | 0.55 | 92 |
| 0.8706 | 2.5262 | 0.55 | 93 |
| 0.7441 | 2.5422 | 0.55 | 94 |
| 0.9710 | 2.4211 | 0.6 | 95 |
| 0.9731 | 2.4090 | 0.6 | 96 |
| 0.8942 | 2.3773 | 0.6 | 97 |
| 1.0461 | 2.4159 | 0.55 | 98 |
| 0.8261 | 2.4425 | 0.6 | 99 |

### Framework versions

- Transformers 4.30.2
- TensorFlow 2.13.0-rc2
- Datasets 2.13.1
- Tokenizers 0.13.3
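As a hedged usage sketch, the model could be loaded for inference roughly as follows. This assumes the repository contains TensorFlow weights and an image processor configuration inherited from microsoft/resnet-50; `example.jpg` is a hypothetical input image.

```python
# Minimal inference sketch for the fine-tuned image classifier (assumptions noted above).
from transformers import AutoImageProcessor, TFAutoModelForImageClassification
from PIL import Image
import tensorflow as tf

processor = AutoImageProcessor.from_pretrained("hbenitez/AV_classifier_resnet50")
model = TFAutoModelForImageClassification.from_pretrained("hbenitez/AV_classifier_resnet50")

image = Image.open("example.jpg")                    # hypothetical input image
inputs = processor(images=image, return_tensors="tf")
logits = model(**inputs).logits                      # shape: (1, num_labels)
predicted_class = int(tf.math.argmax(logits, axis=-1)[0])
print(model.config.id2label[predicted_class])
```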