CTMAE2_CS_V7_6

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.8260
  • Accuracy: 0.8261
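
VideoMAE checkpoints like the base model classify a video from a fixed-length clip of sampled frames (16 frames for the Kinetics-finetuned variants). As a minimal sketch of the frame-sampling step (the function name and the uniform-sampling strategy are illustrative assumptions, not taken from this repo's preprocessing):

```python
def sample_frame_indices(num_frames: int, clip_len: int = 16) -> list[int]:
    """Pick `clip_len` evenly spaced frame indices from a video of `num_frames` frames.

    Illustrative helper: VideoMAE expects a fixed-length clip, so short videos
    are padded by repeating the last frame index.
    """
    if num_frames < clip_len:
        # Not enough frames: repeat the final frame to reach clip_len.
        return list(range(num_frames)) + [num_frames - 1] * (clip_len - num_frames)
    step = num_frames / clip_len
    return [int(i * step) for i in range(clip_len)]
```

For a 160-frame video this yields every tenth frame (indices 0, 10, ..., 150); the resulting clip would then be resized/normalized by the image processor before being passed to the model.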

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 5
  • eval_batch_size: 5
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08 (no additional optimizer arguments)
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 7750
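
With warmup_ratio 0.1 over 7750 steps, the learning rate ramps linearly from 0 to 1e-05 over the first 775 steps, then decays linearly back to 0. A sketch of that schedule, assuming the standard Transformers linear-with-warmup behavior (the function below is a stand-in, not the library implementation):

```python
def linear_lr_with_warmup(step: int,
                          base_lr: float = 1e-05,
                          total_steps: int = 7750,
                          warmup_ratio: float = 0.1) -> float:
    """Learning rate at `step` under linear warmup followed by linear decay."""
    warmup_steps = int(total_steps * warmup_ratio)  # 775 steps here
    if step < warmup_steps:
        # Warmup phase: ramp from 0 to base_lr.
        return base_lr * step / warmup_steps
    # Decay phase: fall linearly from base_lr to 0 at total_steps.
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))
```

The peak rate 1e-05 is reached at step 775 and the schedule hits 0 exactly at step 7750, the last training step.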

Training results

Training Loss | Epoch   | Step | Validation Loss | Accuracy
0.6413        | 0.0201  | 156  | 0.7948          | 0.4565
0.5022        | 1.0201  | 312  | 0.8549          | 0.4565
0.6512        | 2.0201  | 468  | 0.6671          | 0.5870
0.5026        | 3.0201  | 624  | 0.6542          | 0.6522
0.5752        | 4.0201  | 780  | 0.6096          | 0.7174
0.5908        | 5.0201  | 936  | 0.7501          | 0.5217
0.4882        | 6.0201  | 1092 | 0.7652          | 0.6304
0.4128        | 7.0201  | 1248 | 0.7746          | 0.6522
0.4414        | 8.0201  | 1404 | 0.5973          | 0.7174
0.4291        | 9.0201  | 1560 | 0.7594          | 0.7391
0.2729        | 10.0201 | 1716 | 0.5485          | 0.8043
0.5803        | 11.0201 | 1872 | 0.7238          | 0.6957
0.4601        | 12.0201 | 2028 | 0.7228          | 0.7174
0.1306        | 13.0201 | 2184 | 0.9496          | 0.7174
0.4727        | 14.0201 | 2340 | 0.8971          | 0.7174
0.4027        | 15.0201 | 2496 | 0.6291          | 0.7391
0.3149        | 16.0201 | 2652 | 0.8639          | 0.6957
0.1737        | 17.0201 | 2808 | 1.0473          | 0.6957
0.2368        | 18.0201 | 2964 | 0.8658          | 0.7174
0.1155        | 19.0201 | 3120 | 0.7655          | 0.8043
0.156         | 20.0201 | 3276 | 0.7960          | 0.7609
0.2685        | 21.0201 | 3432 | 0.8260          | 0.8261
0.2572        | 22.0201 | 3588 | 0.8299          | 0.7391
0.3788        | 23.0201 | 3744 | 0.8373          | 0.8043
0.3816        | 24.0201 | 3900 | 0.9689          | 0.7609
0.4579        | 25.0201 | 4056 | 1.2806          | 0.6739
0.2543        | 26.0201 | 4212 | 1.2309          | 0.7609
0.1227        | 27.0201 | 4368 | 1.2931          | 0.6522
0.3303        | 28.0201 | 4524 | 1.0450          | 0.8043
0.0808        | 29.0201 | 4680 | 1.3096          | 0.7391
0.0987        | 30.0201 | 4836 | 1.1349          | 0.7826
0.0432        | 31.0201 | 4992 | 1.0443          | 0.7391
0.0373        | 32.0201 | 5148 | 1.8532          | 0.6739
0.1619        | 33.0201 | 5304 | 1.0490          | 0.7391
0.096         | 34.0201 | 5460 | 1.2420          | 0.7826
0.0112        | 35.0201 | 5616 | 1.4820          | 0.7609
0.0282        | 36.0201 | 5772 | 1.3097          | 0.7826
0.1689        | 37.0201 | 5928 | 1.6520          | 0.6739
0.0005        | 38.0201 | 6084 | 1.9084          | 0.6739
0.1626        | 39.0201 | 6240 | 1.3166          | 0.8043
0.08          | 40.0201 | 6396 | 1.4161          | 0.8043
0.062         | 41.0201 | 6552 | 1.4863          | 0.7391
0.1915        | 42.0201 | 6708 | 1.6604          | 0.6522
0.365         | 43.0201 | 6864 | 1.4169          | 0.8261
0.0001        | 44.0201 | 7020 | 1.4883          | 0.8043
0.0014        | 45.0201 | 7176 | 1.5122          | 0.8043
0.0024        | 46.0201 | 7332 | 1.4808          | 0.8043
0.0001        | 47.0201 | 7488 | 1.4890          | 0.7826
0.0001        | 48.0201 | 7644 | 1.5129          | 0.7391
0.0001        | 49.0137 | 7750 | 1.5114          | 0.7609
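
The reported evaluation numbers (loss 0.8260, accuracy 0.8261) match the step-3432 row, presumably the checkpoint with the best validation accuracy (0.8261 also appears at step 6864, but with a much higher loss). A small sketch of that selection logic, using a few rows transcribed from the log above:

```python
# (step, validation_loss, accuracy) rows transcribed from the training log.
log = [
    (1716, 0.5485, 0.8043),
    (3432, 0.8260, 0.8261),
    (6864, 1.4169, 0.8261),
    (7750, 1.5114, 0.7609),
]

# Pick the checkpoint with the highest accuracy, breaking ties on lower loss.
best = max(log, key=lambda row: (row[2], -row[1]))
```

Under this rule `best` is the step-3432 checkpoint, consistent with the summary at the top of the card. Note the rising validation loss late in training, which suggests the model overfits well before step 7750.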

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0

Model size

  • 304M params (Safetensors, F32)