CTMAE-P2-V4-S4

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.4928
  • Accuracy: 0.8043
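A minimal sketch of running this model for video classification, assuming a transformers version with VideoMAE support. The tiny config below is only for illustration (the real checkpoint is ViT-Large sized, ~304M parameters, and would be loaded with `from_pretrained("beingbatman/CTMAE-P2-V4-S4")`); `num_labels=2` is a placeholder, since the card does not state the label set.

```python
import torch
from transformers import VideoMAEConfig, VideoMAEForVideoClassification

# Tiny illustrative config; the actual checkpoint is ViT-Large sized.
config = VideoMAEConfig(
    image_size=224, num_frames=16, num_labels=2,  # num_labels is a placeholder
    hidden_size=64, num_hidden_layers=2,
    num_attention_heads=4, intermediate_size=128,
)
model = VideoMAEForVideoClassification(config).eval()

# VideoMAE expects pixel_values of shape (batch, frames, channels, height, width)
video = torch.randn(1, 16, 3, 224, 224)
with torch.no_grad():
    logits = model(pixel_values=video).logits
print(logits.shape)  # torch.Size([1, 2])
```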

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 13050
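The linear schedule with 10% warmup over 13,050 steps can be sketched as follows (a minimal re-implementation mirroring what `get_linear_schedule_with_warmup` computes, not the exact Trainer code):

```python
def lr_at(step, base_lr=1e-05, total_steps=13050, warmup_ratio=0.1):
    """Linear warmup to base_lr, then linear decay to zero."""
    warmup_steps = int(total_steps * warmup_ratio)  # 1305 steps
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(lr_at(1305))   # peak learning rate: 1e-05
print(lr_at(13050))  # end of training: 0.0
```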

Training results

| Training Loss | Epoch   | Step  | Validation Loss | Accuracy |
|:-------------:|:-------:|:-----:|:---------------:|:--------:|
| 0.6322        | 0.0100  | 131   | 0.6958          | 0.5435   |
| 0.3448        | 1.0100  | 262   | 1.4634          | 0.5435   |
| 1.1225        | 2.0100  | 393   | 2.0266          | 0.5435   |
| 0.7246        | 3.0100  | 524   | 0.9006          | 0.5435   |
| 1.2784        | 4.0100  | 655   | 1.6206          | 0.5435   |
| 0.7234        | 5.0100  | 786   | 1.7217          | 0.5435   |
| 0.7544        | 6.0100  | 917   | 1.4504          | 0.5435   |
| 1.732         | 7.0100  | 1048  | 1.1581          | 0.5435   |
| 0.8227        | 8.0100  | 1179  | 1.9053          | 0.5435   |
| 0.7839        | 9.0100  | 1310  | 0.9410          | 0.5435   |
| 0.8302        | 10.0100 | 1441  | 1.5093          | 0.5435   |
| 0.6264        | 11.0100 | 1572  | 1.7408          | 0.5435   |
| 0.5032        | 12.0100 | 1703  | 0.7154          | 0.5      |
| 1.1847        | 13.0100 | 1834  | 1.1743          | 0.5435   |
| 0.9721        | 14.0100 | 1965  | 1.7714          | 0.5435   |
| 0.6003        | 15.0100 | 2096  | 0.8652          | 0.5870   |
| 0.4912        | 16.0100 | 2227  | 1.7541          | 0.5435   |
| 0.8106        | 17.0100 | 2358  | 1.0464          | 0.5652   |
| 1.2365        | 18.0100 | 2489  | 0.7472          | 0.6739   |
| 1.7469        | 19.0100 | 2620  | 1.3125          | 0.6304   |
| 0.2345        | 20.0100 | 2751  | 1.0220          | 0.6087   |
| 0.483         | 21.0100 | 2882  | 1.2559          | 0.6087   |
| 1.5409        | 22.0100 | 3013  | 1.6619          | 0.5435   |
| 1.1284        | 23.0100 | 3144  | 1.0519          | 0.6739   |
| 0.4471        | 24.0100 | 3275  | 2.1155          | 0.5652   |
| 0.2323        | 25.0100 | 3406  | 1.6991          | 0.6304   |
| 0.871         | 26.0100 | 3537  | 1.4254          | 0.6957   |
| 0.4976        | 27.0100 | 3668  | 1.8011          | 0.6304   |
| 0.5621        | 28.0100 | 3799  | 1.6148          | 0.6739   |
| 0.9854        | 29.0100 | 3930  | 1.4576          | 0.6522   |
| 0.0018        | 30.0100 | 4061  | 1.5995          | 0.7174   |
| 0.3031        | 31.0100 | 4192  | 1.5070          | 0.6957   |
| 0.8871        | 32.0100 | 4323  | 1.7620          | 0.6522   |
| 0.6212        | 33.0100 | 4454  | 1.7319          | 0.6739   |
| 0.5674        | 34.0100 | 4585  | 1.8520          | 0.6739   |
| 0.2845        | 35.0100 | 4716  | 1.8629          | 0.6522   |
| 0.1611        | 36.0100 | 4847  | 1.7524          | 0.6522   |
| 0.0779        | 37.0100 | 4978  | 1.5949          | 0.6739   |
| 0.6805        | 38.0100 | 5109  | 2.1198          | 0.6739   |
| 0.0297        | 39.0100 | 5240  | 2.1019          | 0.6739   |
| 0.0005        | 40.0100 | 5371  | 2.3706          | 0.6739   |
| 0.4209        | 41.0100 | 5502  | 1.3258          | 0.6957   |
| 0.2219        | 42.0100 | 5633  | 1.9883          | 0.6957   |
| 0.0184        | 43.0100 | 5764  | 2.0343          | 0.6522   |
| 0.001         | 44.0100 | 5895  | 1.9996          | 0.6957   |
| 0.4252        | 45.0100 | 6026  | 1.9136          | 0.6522   |
| 0.0456        | 46.0100 | 6157  | 1.9553          | 0.6739   |
| 0.375         | 47.0100 | 6288  | 1.9227          | 0.6957   |
| 0.6046        | 48.0100 | 6419  | 2.6295          | 0.6087   |
| 0.2836        | 49.0100 | 6550  | 1.7961          | 0.7174   |
| 0.1522        | 50.0100 | 6681  | 1.3961          | 0.7826   |
| 0.6705        | 51.0100 | 6812  | 1.7068          | 0.7391   |
| 0.0005        | 52.0100 | 6943  | 1.7986          | 0.7391   |
| 0.02          | 53.0100 | 7074  | 1.6991          | 0.7609   |
| 0.0037        | 54.0100 | 7205  | 1.5867          | 0.7391   |
| 0.2488        | 55.0100 | 7336  | 1.4928          | 0.8043   |
| 0.0297        | 56.0100 | 7467  | 1.8699          | 0.7174   |
| 0.0003        | 57.0100 | 7598  | 2.1014          | 0.7174   |
| 0.0008        | 58.0100 | 7729  | 1.9651          | 0.6739   |
| 0.2982        | 59.0100 | 7860  | 2.5969          | 0.6522   |
| 0.3197        | 60.0100 | 7991  | 2.3923          | 0.6087   |
| 0.012         | 61.0100 | 8122  | 2.4473          | 0.6522   |
| 0.0002        | 62.0100 | 8253  | 2.1692          | 0.6957   |
| 0.0002        | 63.0100 | 8384  | 2.3358          | 0.6739   |
| 0.0001        | 64.0100 | 8515  | 2.6785          | 0.6739   |
| 0.364         | 65.0100 | 8646  | 2.7085          | 0.6522   |
| 0.0001        | 66.0100 | 8777  | 2.8955          | 0.6522   |
| 0.0002        | 67.0100 | 8908  | 2.2053          | 0.7391   |
| 0.0002        | 68.0100 | 9039  | 2.6436          | 0.6739   |
| 0.0001        | 69.0100 | 9170  | 2.6494          | 0.6739   |
| 0.0046        | 70.0100 | 9301  | 2.2621          | 0.7391   |
| 0.0001        | 71.0100 | 9432  | 2.9285          | 0.6739   |
| 0.0001        | 72.0100 | 9563  | 2.4097          | 0.6957   |
| 0.0001        | 73.0100 | 9694  | 2.8739          | 0.6304   |
| 0.0004        | 74.0100 | 9825  | 2.8154          | 0.6304   |
| 0.3257        | 75.0100 | 9956  | 2.3350          | 0.6957   |
| 0.0001        | 76.0100 | 10087 | 1.9011          | 0.7391   |
| 0.0001        | 77.0100 | 10218 | 2.3655          | 0.7174   |
| 0.0001        | 78.0100 | 10349 | 2.6572          | 0.6739   |
| 0.0001        | 79.0100 | 10480 | 2.6350          | 0.6739   |
| 0.4185        | 80.0100 | 10611 | 2.4854          | 0.7174   |
| 0.0001        | 81.0100 | 10742 | 2.4658          | 0.7391   |
| 0.0           | 82.0100 | 10873 | 2.6691          | 0.6957   |
| 0.0001        | 83.0100 | 11004 | 2.7930          | 0.5870   |
| 0.0001        | 84.0100 | 11135 | 2.5645          | 0.6957   |
| 0.0001        | 85.0100 | 11266 | 2.5759          | 0.7174   |
| 0.0           | 86.0100 | 11397 | 2.6901          | 0.6957   |
| 0.0           | 87.0100 | 11528 | 2.6050          | 0.6957   |
| 0.0           | 88.0100 | 11659 | 3.0276          | 0.6087   |
| 0.0001        | 89.0100 | 11790 | 2.9324          | 0.6739   |
| 0.0           | 90.0100 | 11921 | 2.9194          | 0.6739   |
| 0.0           | 91.0100 | 12052 | 2.5726          | 0.7391   |
| 0.0           | 92.0100 | 12183 | 2.8832          | 0.6739   |
| 0.0001        | 93.0100 | 12314 | 3.0274          | 0.6304   |
| 0.0001        | 94.0100 | 12445 | 2.8242          | 0.6957   |
| 0.0           | 95.0100 | 12576 | 2.7715          | 0.6957   |
| 0.0           | 96.0100 | 12707 | 2.7907          | 0.6957   |
| 0.4392        | 97.0100 | 12838 | 2.7856          | 0.6957   |
| 0.0           | 98.0100 | 12969 | 2.7755          | 0.6957   |
| 0.0           | 99.0062 | 13050 | 2.7569          | 0.6957   |
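The headline numbers (Loss 1.4928, Accuracy 0.8043) correspond to the epoch-55 row (step 7336), the best checkpoint by validation accuracy, not the final one. Picking that row can be sketched as below, using a hypothetical list of `(step, validation_loss, accuracy)` tuples holding a few rows from the table:

```python
# A few (step, validation_loss, accuracy) rows from the training log above
log = [
    (131, 0.6958, 0.5435),
    (6681, 1.3961, 0.7826),
    (7336, 1.4928, 0.8043),
    (13050, 2.7569, 0.6957),
]

# Best checkpoint: highest accuracy, ties broken by lower validation loss
best = max(log, key=lambda r: (r[2], -r[1]))
print(best)  # (7336, 1.4928, 0.8043)
```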

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0