CTMAE-P2-V5-3g-S4

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4441
  • Accuracy: 0.8444

Model description

More information needed

Intended uses & limitations

More information needed
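No usage example is provided with this card. As a minimal sketch, the checkpoint should load through the standard Transformers VideoMAE classes; the 16-frame, 224x224 input format is assumed from the MCG-NJU/videomae-large-finetuned-kinetics base model, and the evenly-spaced frame sampler below is an illustrative helper, not part of this repository:

```python
# Sketch: preparing a clip for this checkpoint. Assumes 16 input frames,
# as in the VideoMAE base model this card reports fine-tuning from.
import numpy as np

def sample_frame_indices(clip_len: int, total_frames: int) -> np.ndarray:
    """Pick `clip_len` evenly spaced frame indices from a video with
    `total_frames` frames (a common sampling scheme for VideoMAE)."""
    indices = np.linspace(0, total_frames - 1, num=clip_len)
    return indices.astype(np.int64)

# Loading the model needs the `transformers` library and network access:
#
#   from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification
#   processor = VideoMAEImageProcessor.from_pretrained("beingbatman/CTMAE-P2-V5-3g-S4")
#   model = VideoMAEForVideoClassification.from_pretrained("beingbatman/CTMAE-P2-V5-3g-S4")
#
#   frames = [video[i] for i in sample_frame_indices(16, len(video))]
#   inputs = processor(frames, return_tensors="pt")
#   pred = model(**inputs).logits.argmax(-1).item()

print(sample_frame_indices(16, 300))
```

Since the card does not document the training data or label set, the meaning of the predicted class index has to be taken from the checkpoint's own `id2label` config.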

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 2
  • eval_batch_size: 2
  • seed: 42
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 13050
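Taken together, the scheduler settings imply 1305 warmup steps (0.1 × 13050). The sketch below is a plain-Python illustration of that linear warmup/decay shape, not the exact Transformers scheduler implementation:

```python
# Learning-rate schedule implied by the hyperparameters above:
# linear warmup over warmup_ratio * training_steps steps, then
# linear decay to zero at the final step.
PEAK_LR = 1e-5
TRAINING_STEPS = 13050
WARMUP_STEPS = int(0.1 * TRAINING_STEPS)  # 1305

def lr_at(step: int) -> float:
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    # linear decay from the peak down to 0 at TRAINING_STEPS
    return PEAK_LR * (TRAINING_STEPS - step) / (TRAINING_STEPS - WARMUP_STEPS)

print(WARMUP_STEPS)         # 1305
print(lr_at(WARMUP_STEPS))  # peak learning rate, 1e-05
```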

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|---------------|-------|------|-----------------|----------|
| 0.6729 | 0.0100 | 131 | 0.7202 | 0.4667 |
| 0.4727 | 1.0100 | 262 | 0.9200 | 0.4667 |
| 0.6907 | 2.0100 | 393 | 0.9521 | 0.4667 |
| 0.5985 | 3.0100 | 524 | 0.9436 | 0.4667 |
| 0.9320 | 4.0100 | 655 | 1.0027 | 0.4667 |
| 0.5625 | 5.0100 | 786 | 0.8652 | 0.5556 |
| 0.6848 | 6.0100 | 917 | 0.8245 | 0.4667 |
| 1.9843 | 7.0100 | 1048 | 1.0328 | 0.4667 |
| 1.0259 | 8.0100 | 1179 | 1.1008 | 0.4667 |
| 0.6501 | 9.0100 | 1310 | 0.6334 | 0.7333 |
| 0.7425 | 10.0100 | 1441 | 1.9972 | 0.4667 |
| 0.3816 | 11.0100 | 1572 | 0.8999 | 0.6222 |
| 0.5828 | 12.0100 | 1703 | 0.6653 | 0.6444 |
| 0.6965 | 13.0100 | 1834 | 1.1995 | 0.5778 |
| 1.1441 | 14.0100 | 1965 | 1.1112 | 0.6444 |
| 1.0802 | 15.0100 | 2096 | 1.1944 | 0.6889 |
| 0.6764 | 16.0100 | 2227 | 1.2079 | 0.6667 |
| 0.9995 | 17.0100 | 2358 | 0.4441 | 0.8444 |
| 0.1524 | 18.0100 | 2489 | 0.6179 | 0.7111 |
| 0.6087 | 19.0100 | 2620 | 0.5499 | 0.7111 |
| 0.6430 | 20.0100 | 2751 | 1.3079 | 0.5556 |
| 0.7549 | 21.0100 | 2882 | 0.5691 | 0.7778 |
| 0.0275 | 22.0100 | 3013 | 0.8075 | 0.7778 |
| 0.7877 | 23.0100 | 3144 | 0.6420 | 0.7556 |
| 0.1937 | 24.0100 | 3275 | 0.9011 | 0.6889 |
| 0.5489 | 25.0100 | 3406 | 0.8769 | 0.7111 |
| 0.6360 | 26.0100 | 3537 | 0.5532 | 0.8222 |
| 0.6500 | 27.0100 | 3668 | 1.0277 | 0.7556 |
| 1.7520 | 28.0100 | 3799 | 0.8335 | 0.7556 |
| 0.8998 | 29.0100 | 3930 | 0.6492 | 0.8000 |
| 0.5018 | 30.0100 | 4061 | 0.8418 | 0.8000 |
| 0.8232 | 31.0100 | 4192 | 0.8180 | 0.7778 |
| 1.0924 | 32.0100 | 4323 | 0.7991 | 0.8000 |
| 0.6881 | 33.0100 | 4454 | 1.5892 | 0.7111 |
| 0.6649 | 34.0100 | 4585 | 1.4467 | 0.6889 |
| 0.5765 | 35.0100 | 4716 | 1.2852 | 0.6889 |
| 0.5350 | 36.0100 | 4847 | 0.8898 | 0.7778 |
| 0.4813 | 37.0100 | 4978 | 0.7436 | 0.8000 |
| 0.5694 | 38.0100 | 5109 | 0.8707 | 0.7778 |
| 0.1397 | 39.0100 | 5240 | 1.1453 | 0.7556 |
| 0.2083 | 40.0100 | 5371 | 1.1627 | 0.7778 |
| 1.0281 | 41.0100 | 5502 | 1.7993 | 0.7111 |
| 0.7729 | 42.0100 | 5633 | 0.9857 | 0.7778 |
| 0.1594 | 43.0100 | 5764 | 2.0440 | 0.6444 |
| 0.0016 | 44.0100 | 5895 | 0.7688 | 0.8222 |
| 0.0029 | 45.0100 | 6026 | 1.5397 | 0.7111 |
| 0.0032 | 46.0100 | 6157 | 1.4149 | 0.7556 |
| 0.9143 | 47.0100 | 6288 | 1.1824 | 0.7778 |
| 1.1312 | 48.0100 | 6419 | 1.0552 | 0.8222 |
| 0.5491 | 49.0100 | 6550 | 1.4101 | 0.7333 |
| 0.0021 | 50.0100 | 6681 | 0.9631 | 0.8222 |
| 0.7690 | 51.0100 | 6812 | 1.6253 | 0.7111 |
| 0.5188 | 52.0100 | 6943 | 0.8261 | 0.8000 |
| 0.4272 | 53.0100 | 7074 | 1.2140 | 0.8222 |
| 0.5066 | 54.0100 | 7205 | 1.3431 | 0.7556 |
| 0.3655 | 55.0100 | 7336 | 1.4484 | 0.7778 |
| 0.3319 | 56.0100 | 7467 | 1.6030 | 0.7333 |
| 0.0106 | 57.0100 | 7598 | 1.2037 | 0.8444 |
| 0.0008 | 58.0100 | 7729 | 1.9449 | 0.7333 |
| 0.0693 | 59.0100 | 7860 | 1.9971 | 0.7333 |
| 0.2319 | 60.0100 | 7991 | 0.9876 | 0.8222 |
| 0.6032 | 61.0100 | 8122 | 1.5705 | 0.7333 |
| 0.0361 | 62.0100 | 8253 | 1.1494 | 0.8000 |
| 0.4365 | 63.0100 | 8384 | 1.5111 | 0.7556 |
| 0.0002 | 64.0100 | 8515 | 1.5259 | 0.7778 |
| 0.0074 | 65.0100 | 8646 | 2.0437 | 0.7333 |
| 0.7648 | 66.0100 | 8777 | 1.5944 | 0.8000 |
| 0.1898 | 67.0100 | 8908 | 1.7697 | 0.7556 |
| 0.3616 | 68.0100 | 9039 | 1.4570 | 0.7778 |
| 0.4445 | 69.0100 | 9170 | 1.8239 | 0.7111 |
| 0.0008 | 70.0100 | 9301 | 1.2796 | 0.8222 |
| 0.3047 | 71.0100 | 9432 | 1.3140 | 0.8000 |
| 0.4618 | 72.0100 | 9563 | 1.1418 | 0.8000 |
| 0.0014 | 73.0100 | 9694 | 1.4893 | 0.7778 |
| 0.1784 | 74.0100 | 9825 | 1.3635 | 0.8222 |
| 0.0585 | 75.0100 | 9956 | 1.3869 | 0.8222 |
| 0.0001 | 76.0100 | 10087 | 1.5997 | 0.7556 |
| 0.0001 | 77.0100 | 10218 | 1.2910 | 0.8222 |
| 0.2202 | 78.0100 | 10349 | 1.0942 | 0.8444 |
| 0.0001 | 79.0100 | 10480 | 1.6201 | 0.7778 |
| 0.0021 | 80.0100 | 10611 | 1.7737 | 0.7556 |
| 0.2677 | 81.0100 | 10742 | 1.7019 | 0.7556 |
| 0.4637 | 82.0100 | 10873 | 1.4919 | 0.7778 |
| 0.0001 | 83.0100 | 11004 | 1.4477 | 0.8222 |
| 0.0000 | 84.0100 | 11135 | 1.5797 | 0.8000 |
| 0.0000 | 85.0100 | 11266 | 1.6962 | 0.7778 |
| 0.0001 | 86.0100 | 11397 | 1.3234 | 0.8222 |
| 0.0000 | 87.0100 | 11528 | 1.6734 | 0.7778 |
| 0.0001 | 88.0100 | 11659 | 1.5921 | 0.8000 |
| 0.0018 | 89.0100 | 11790 | 1.6718 | 0.7778 |
| 0.0000 | 90.0100 | 11921 | 1.5656 | 0.8000 |
| 0.3858 | 91.0100 | 12052 | 1.6168 | 0.7778 |
| 0.0000 | 92.0100 | 12183 | 1.6664 | 0.8000 |
| 0.0001 | 93.0100 | 12314 | 1.6439 | 0.8000 |
| 0.0000 | 94.0100 | 12445 | 1.4756 | 0.8222 |
| 0.0001 | 95.0100 | 12576 | 1.5124 | 0.8000 |
| 0.4576 | 96.0100 | 12707 | 1.4554 | 0.8444 |
| 0.0000 | 97.0100 | 12838 | 1.4442 | 0.8444 |
| 0.0000 | 98.0100 | 12969 | 1.4453 | 0.8444 |
| 0.0000 | 99.0062 | 13050 | 1.4455 | 0.8444 |
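The headline numbers at the top of this card (loss 0.4441, accuracy 0.8444) come from the epoch-17 checkpoint at step 2358, not from the final step, so the best checkpoint was evidently selected by validation loss. A small sketch of that selection, using a few (step, val_loss, accuracy) rows copied from the table above:

```python
# Best-checkpoint selection by validation loss, illustrated on a
# handful of rows from the evaluation log above.
log = [
    (2227, 1.2079, 0.6667),
    (2358, 0.4441, 0.8444),
    (2489, 0.6179, 0.7111),
    (13050, 1.4455, 0.8444),
]

best_step, best_loss, best_acc = min(log, key=lambda row: row[1])
print(best_step, best_loss, best_acc)  # 2358 0.4441 0.8444
```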

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0

Model weights

  • Format: Safetensors
  • Model size: 304M params
  • Tensor type: F32

Model tree for beingbatman/CTMAE-P2-V5-3g-S4

This model is one of 49 fine-tuned versions of MCG-NJU/videomae-large-finetuned-kinetics.