CTMAE-P2-V5-3g-S2

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.1479
  • Accuracy: 0.8222

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 1e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 13050
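For reference, a linear schedule with a 0.1 warmup ratio over 13050 steps ramps the learning rate up for the first 1305 steps and then decays it linearly to zero. The sketch below (plain Python, no Trainer dependency; the function name `lr_at` is ours, not part of any library) mirrors the shape of that schedule:

```python
# Sketch of the linear-with-warmup LR schedule implied by the
# hyperparameters above (learning_rate=1e-05, warmup_ratio=0.1,
# training_steps=13050). Reimplemented with no dependencies;
# `lr_at` is a hypothetical helper name, not a Transformers API.

BASE_LR = 1e-5
TOTAL_STEPS = 13050
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)  # 1305 steps

def lr_at(step: int) -> float:
    """Learning rate at a given optimizer step."""
    if step < WARMUP_STEPS:
        # Linear ramp from 0 up to BASE_LR over the warmup phase.
        return BASE_LR * step / WARMUP_STEPS
    # Linear decay from BASE_LR down to 0 at TOTAL_STEPS.
    remaining = TOTAL_STEPS - step
    return BASE_LR * max(0.0, remaining / (TOTAL_STEPS - WARMUP_STEPS))
```

The peak learning rate of 1e-05 is reached exactly at step 1305, the end of warmup.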

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 1.3342        | 0.02  | 261   | 1.3344          | 0.4667   |
| 0.5407        | 1.02  | 522   | 3.0926          | 0.4667   |
| 1.9217        | 2.02  | 783   | 2.3252          | 0.4667   |
| 0.6849        | 3.02  | 1044  | 2.6546          | 0.4667   |
| 1.7893        | 4.02  | 1305  | 1.1093          | 0.4667   |
| 0.8412        | 5.02  | 1566  | 1.2741          | 0.4667   |
| 1.4469        | 6.02  | 1827  | 2.1842          | 0.4667   |
| 1.2686        | 7.02  | 2088  | 1.8709          | 0.4667   |
| 1.4495        | 8.02  | 2349  | 1.8210          | 0.4667   |
| 0.7905        | 9.02  | 2610  | 0.6076          | 0.7556   |
| 0.8322        | 10.02 | 2871  | 0.9286          | 0.7111   |
| 0.8618        | 11.02 | 3132  | 0.9191          | 0.7111   |
| 1.1268        | 12.02 | 3393  | 1.7668          | 0.5778   |
| 0.7087        | 13.02 | 3654  | 1.8699          | 0.5333   |
| 1.5327        | 14.02 | 3915  | 1.4736          | 0.5556   |
| 0.531         | 15.02 | 4176  | 1.5145          | 0.6667   |
| 1.0064        | 16.02 | 4437  | 1.4191          | 0.6222   |
| 0.2797        | 17.02 | 4698  | 1.3301          | 0.7333   |
| 1.0355        | 18.02 | 4959  | 1.4829          | 0.7111   |
| 0.937         | 19.02 | 5220  | 1.8992          | 0.5556   |
| 0.0065        | 20.02 | 5481  | 1.1564          | 0.6444   |
| 0.9542        | 21.02 | 5742  | 1.4034          | 0.6889   |
| 0.4833        | 22.02 | 6003  | 1.3208          | 0.7111   |
| 1.1719        | 23.02 | 6264  | 1.2117          | 0.7333   |
| 0.9043        | 24.02 | 6525  | 1.0468          | 0.8000   |
| 1.1914        | 25.02 | 6786  | 1.3830          | 0.7111   |
| 0.0032        | 26.02 | 7047  | 1.8348          | 0.7111   |
| 0.0022        | 27.02 | 7308  | 1.9530          | 0.6889   |
| 1.1481        | 28.02 | 7569  | 1.4339          | 0.7111   |
| 0.8332        | 29.02 | 7830  | 2.0841          | 0.6667   |
| 0.9692        | 30.02 | 8091  | 1.4363          | 0.7333   |
| 0.0095        | 31.02 | 8352  | 1.6087          | 0.7778   |
| 0.0007        | 32.02 | 8613  | 1.1479          | 0.8222   |
| 0.534         | 33.02 | 8874  | 1.7736          | 0.6667   |
| 1.149         | 34.02 | 9135  | 1.3923          | 0.8000   |
| 0.0015        | 35.02 | 9396  | 1.7895          | 0.6889   |
| 0.0007        | 36.02 | 9657  | 1.6939          | 0.7556   |
| 0.6659        | 37.02 | 9918  | 1.9130          | 0.6889   |
| 0.2733        | 38.02 | 10179 | 1.8643          | 0.6667   |
| 0.3147        | 39.02 | 10440 | 1.8536          | 0.6889   |
| 0.0003        | 40.02 | 10701 | 1.6605          | 0.7778   |
| 0.4885        | 41.02 | 10962 | 1.7213          | 0.7333   |
| 0.0086        | 42.02 | 11223 | 1.9743          | 0.7111   |
| 0.0004        | 43.02 | 11484 | 1.5626          | 0.7556   |
| 0.0005        | 44.02 | 11745 | 1.5694          | 0.7333   |
| 0.0003        | 45.02 | 12006 | 1.6538          | 0.7778   |
| 0.0002        | 46.02 | 12267 | 1.8243          | 0.7111   |
| 0.0001        | 47.02 | 12528 | 1.7648          | 0.7556   |
| 0.0002        | 48.02 | 12789 | 1.6705          | 0.7333   |
| 0.0008        | 49.02 | 13050 | 1.6519          | 0.7556   |

Framework versions

  • Transformers 4.46.2
  • PyTorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0
Model size

  • 304M params (F32, Safetensors)