---
license: apache-2.0
tags:
- generated_from_keras_callback
model-index:
- name: hbenitez/AV_classifier_resnet50
  results: []
---

# hbenitez/AV_classifier_resnet50

This model is a fine-tuned version of [microsoft/resnet-50](https://huggingface.co/microsoft/resnet-50) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 0.7480
- Validation Loss: 1.9590
- Train Accuracy: 0.7
- Epoch: 99

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': {'module': 'keras.optimizers.schedules', 'class_name': 'PolynomialDecay', 'config': {'initial_learning_rate': 3e-05, 'decay_steps': 8000, 'end_learning_rate': 0.0, 'power': 1.0, 'cycle': False, 'name': None}, 'registered_name': None}, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-08, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Train Accuracy | Epoch |
|:----------:|:---------------:|:--------------:|:-----:|
| 8.2909     | 8.8519          | 0.0            | 0     |
| 8.1073     | 8.1722          | 0.0            | 1     |
| 8.0937     | 7.9716          | 0.0            | 2     |
| 7.8267     | 7.8585          | 0.0            | 3     |
| 7.5311     | 7.6686          | 0.0            | 4     |
| 7.5912     | 7.6775          | 0.0            | 5     |
| 7.3261     | 7.5158          | 0.0            | 6     |
| 7.1478     | 7.5385          | 0.0            | 7     |
| 7.0659     | 7.4539          | 0.0            | 8     |
| 6.8157     | 7.3734          | 0.0            | 9     |
| 6.8977     | 7.2370          | 0.0            | 10    |
| 6.6346     | 7.0331          | 0.0            | 11    |
| 6.5901     | 6.9245          | 0.0            | 12    |
| 6.4704     | 7.0762          | 0.0            | 13    |
| 6.3974     | 6.8628          | 0.0            | 14    |
| 6.0737     | 6.7022          | 0.0            | 15    |
| 5.9180     | 6.6911          | 0.0            | 16    |
| 5.6420     | 6.6420          | 0.0            | 17    |
| 5.8163     | 6.6039          | 0.0            | 18    |
| 5.5002     | 6.5215          | 0.0            | 19    |
| 5.3426     | 6.4144          | 0.0            | 20    |
| 5.4375     | 6.3628          | 0.05           | 21    |
| 5.1789     | 6.2723          | 0.05           | 22    |
| 5.1680     | 6.2523          | 0.05           | 23    |
| 5.1469     | 6.1031          | 0.05           | 24    |
| 4.9030     | 6.0026          | 0.0            | 25    |
| 4.7596     | 5.8046          | 0.15           | 26    |
| 4.8801     | 5.8023          | 0.1            | 27    |
| 4.6371     | 5.8529          | 0.1            | 28    |
| 4.4716     | 5.6872          | 0.1            | 29    |
| 4.5640     | 5.5713          | 0.1            | 30    |
| 4.2409     | 5.6985          | 0.05           | 31    |
| 4.3464     | 5.6036          | 0.1            | 32    |
| 4.0358     | 5.4160          | 0.1            | 33    |
| 3.7727     | 5.2536          | 0.15           | 34    |
| 3.8634     | 5.1261          | 0.2            | 35    |
| 3.7902     | 5.0305          | 0.2            | 36    |
| 3.5799     | 4.9175          | 0.25           | 37    |
| 3.6493     | 4.8794          | 0.25           | 38    |
| 3.3610     | 4.7168          | 0.2            | 39    |
| 3.3305     | 4.7768          | 0.2            | 40    |
| 3.2444     | 4.6929          | 0.25           | 41    |
| 3.3055     | 4.7278          | 0.2            | 42    |
| 3.0663     | 4.5155          | 0.2            | 43    |
| 2.9070     | 4.4748          | 0.2            | 44    |
| 3.0524     | 4.2340          | 0.2            | 45    |
| 2.8021     | 4.1962          | 0.25           | 46    |
| 2.7445     | 4.1974          | 0.25           | 47    |
| 2.6257     | 4.0621          | 0.25           | 48    |
| 2.4276     | 4.0370          | 0.3            | 49    |
| 2.5626     | 4.0561          | 0.2            | 50    |
| 2.4725     | 3.9501          | 0.3            | 51    |
| 2.1471     | 3.8683          | 0.35           | 52    |
| 2.2171     | 3.7830          | 0.3            | 53    |
| 2.0710     | 3.8210          | 0.3            | 54    |
| 1.9833     | 3.5905          | 0.35           | 55    |
| 2.0103     | 3.5331          | 0.4            | 56    |
| 1.7876     | 3.5856          | 0.4            | 57    |
| 1.9404     | 3.5545          | 0.3            | 58    |
| 1.8680     | 3.4422          | 0.4            | 59    |
| 1.9024     | 3.4521          | 0.45           | 60    |
| 1.7234     | 3.4420          | 0.45           | 61    |
| 1.9552     | 3.4011          | 0.4            | 62    |
| 1.6278     | 3.3911          | 0.35           | 63    |
| 1.3892     | 3.3011          | 0.35           | 64    |
| 1.4450     | 3.2232          | 0.45           | 65    |
| 1.4787     | 3.2376          | 0.4            | 66    |
| 1.3586     | 3.1092          | 0.5            | 67    |
| 1.5565     | 3.1247          | 0.45           | 68    |
| 1.3352     | 3.0486          | 0.45           | 69    |
| 1.4656     | 2.9821          | 0.55           | 70    |
| 1.4609     | 2.8628          | 0.5            | 71    |
| 1.3140     | 2.7668          | 0.55           | 72    |
| 1.2623     | 2.7777          | 0.55           | 73    |
| 1.1311     | 2.7987          | 0.55           | 74    |
| 1.3050     | 2.7233          | 0.6            | 75    |
| 1.1644     | 2.6816          | 0.6            | 76    |
| 1.0867     | 2.6325          | 0.6            | 77    |
| 1.0870     | 2.6182          | 0.6            | 78    |
| 1.0695     | 2.6422          | 0.6            | 79    |
| 1.0438     | 2.6493          | 0.6            | 80    |
| 1.0208     | 2.6355          | 0.6            | 81    |
| 0.9287     | 2.4896          | 0.65           | 82    |
| 1.0166     | 2.5370          | 0.6            | 83    |
| 0.7797     | 2.6378          | 0.6            | 84    |
| 0.7836     | 2.5321          | 0.65           | 85    |
| 0.9135     | 2.4290          | 0.55           | 86    |
| 0.9067     | 2.3287          | 0.65           | 87    |
| 0.8000     | 2.2374          | 0.65           | 88    |
| 0.8086     | 2.3477          | 0.65           | 89    |
| 0.8166     | 2.2292          | 0.65           | 90    |
| 1.0275     | 2.2574          | 0.65           | 91    |
| 0.8453     | 2.1617          | 0.65           | 92    |
| 0.6428     | 2.1317          | 0.7            | 93    |
| 0.7761     | 2.0171          | 0.7            | 94    |
| 0.7433     | 2.0812          | 0.75           | 95    |
| 0.7227     | 2.2041          | 0.65           | 96    |
| 0.6323     | 2.0665          | 0.7            | 97    |
| 0.6911     | 2.0789          | 0.65           | 98    |
| 0.7480     | 1.9590          | 0.7            | 99    |

### Framework versions

- Transformers 4.30.2
- TensorFlow 2.13.0-rc2
- Datasets 2.13.1
- Tokenizers 0.13.3
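
The learning-rate schedule in the hyperparameters above is a `PolynomialDecay` with `power=1.0` and `cycle=False`, i.e. a simple linear ramp from 3e-05 down to 0 over 8000 steps. As a rough sketch (not the Keras implementation itself, just the same formula in plain Python):

```python
def polynomial_decay(step, initial_lr=3e-05, decay_steps=8000,
                     end_lr=0.0, power=1.0):
    """Approximates keras.optimizers.schedules.PolynomialDecay with cycle=False.

    With power=1.0 this is a linear interpolation from initial_lr at step 0
    down to end_lr at decay_steps, held constant afterwards.
    """
    step = min(step, decay_steps)  # cycle=False: clamp at end_lr past decay_steps
    fraction = 1.0 - step / decay_steps
    return (initial_lr - end_lr) * fraction ** power + end_lr

print(polynomial_decay(0))      # start of training: 3e-05
print(polynomial_decay(4000))   # halfway: 1.5e-05
print(polynomial_decay(8000))   # fully decayed: 0.0
```

With 100 epochs and 8000 decay steps, the schedule implies roughly 80 optimizer steps per epoch, so the learning rate reaches zero at the end of training.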