---

license: cc-by-nc-4.0
base_model: facebook/timesformer-base-finetuned-k400
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: tsf-gs-rots-wtoken-DRPT0.3-r128-f150-6.6-h768-i3072-p32-b8-e50
  results: []
---


<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# tsf-gs-rots-wtoken-DRPT0.3-r128-f150-6.6-h768-i3072-p32-b8-e50

This model is a fine-tuned version of [facebook/timesformer-base-finetuned-k400](https://huggingface.co/facebook/timesformer-base-finetuned-k400) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 1.7417
- Accuracy: 0.6150

## Model description

More information needed

## Intended uses & limitations

More information needed
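
As a starting point, inference with this checkpoint should look roughly like the sketch below. The repo id is a placeholder, and the 8-frame, 224×224 clip shape follows the defaults of the base `facebook/timesformer-base-finetuned-k400` checkpoint; the fine-tuned model may expect a different clip length or preprocessing.

```python
# Minimal inference sketch. The repo id is a placeholder and the 8-frame,
# 224x224 clip shape follows the base checkpoint's defaults; adjust to match
# the preprocessing actually used for this fine-tune.
import numpy as np
import torch
from transformers import AutoImageProcessor, TimesformerForVideoClassification

model_id = "<namespace>/tsf-gs-rots-wtoken-DRPT0.3-r128-f150-6.6-h768-i3072-p32-b8-e50"  # placeholder

processor = AutoImageProcessor.from_pretrained(model_id)
model = TimesformerForVideoClassification.from_pretrained(model_id)
model.eval()

# Dummy clip: 8 RGB frames of 224x224 (replace with frames sampled from a real video).
video = list(np.random.randint(0, 256, (8, 224, 224, 3), dtype=np.uint8))

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

print(model.config.id2label[logits.argmax(-1).item()])
```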

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 5400
- mixed_precision_training: Native AMP
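
These settings correspond roughly to the `TrainingArguments` sketched below. This is a hedged reconstruction, not the exact training script: the Adam betas and epsilon listed above are the library defaults, `output_dir` is assumed, and dataset loading plus the `Trainer` call are omitted.

```python
# Rough reconstruction of the hyperparameters above as 🤗 TrainingArguments.
# output_dir and anything not listed in this card are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="tsf-gs-rots-wtoken-DRPT0.3-r128-f150-6.6-h768-i3072-p32-b8-e50",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=5400,   # training_steps
    fp16=True,        # "Native AMP" mixed precision
    # adam_beta1=0.9, adam_beta2=0.999, adam_epsilon=1e-8 are the defaults
)
```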



### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| 1.1273        | 0.0202  | 109  | 1.1116          | 0.3262   |
| 1.1488        | 1.0202  | 218  | 1.0997          | 0.3369   |
| 1.1365        | 2.0202  | 327  | 1.1016          | 0.3369   |
| 1.1232        | 3.0202  | 436  | 1.1009          | 0.3369   |
| 1.0927        | 4.0202  | 545  | 1.1017          | 0.3369   |
| 1.0807        | 5.0202  | 654  | 1.1055          | 0.3262   |
| 1.1257        | 6.0202  | 763  | 1.1005          | 0.3262   |
| 1.0961        | 7.0202  | 872  | 1.0999          | 0.3369   |
| 1.1192        | 8.0202  | 981  | 1.0997          | 0.3262   |
| 1.1039        | 9.0202  | 1090 | 1.0982          | 0.3369   |
| 1.1047        | 10.0202 | 1199 | 1.0988          | 0.3369   |
| 1.0662        | 11.0202 | 1308 | 1.3609          | 0.3743   |
| 1.197         | 12.0202 | 1417 | 1.1068          | 0.3369   |
| 1.1331        | 13.0202 | 1526 | 1.1085          | 0.3422   |
| 1.1174        | 14.0202 | 1635 | 1.1129          | 0.3262   |
| 1.0838        | 15.0202 | 1744 | 1.0893          | 0.4011   |
| 1.0943        | 16.0202 | 1853 | 1.0385          | 0.3904   |
| 1.0989        | 17.0202 | 1962 | 1.0832          | 0.5027   |
| 1.0326        | 18.0202 | 2071 | 0.9577          | 0.4813   |
| 1.0394        | 19.0202 | 2180 | 0.9385          | 0.6043   |
| 0.9952        | 20.0202 | 2289 | 0.8765          | 0.6096   |
| 0.9504        | 21.0202 | 2398 | 0.8307          | 0.6096   |
| 0.9256        | 22.0202 | 2507 | 0.8004          | 0.6471   |
| 0.8924        | 23.0202 | 2616 | 0.9152          | 0.5989   |
| 0.9158        | 24.0202 | 2725 | 0.7679          | 0.6952   |
| 0.8838        | 25.0202 | 2834 | 0.7533          | 0.6952   |
| 1.0359        | 26.0202 | 2943 | 0.7408          | 0.6845   |
| 0.8345        | 27.0202 | 3052 | 0.7069          | 0.7112   |
| 0.8803        | 28.0202 | 3161 | 0.7740          | 0.6684   |
| 0.7475        | 29.0202 | 3270 | 0.6999          | 0.7112   |
| 0.5596        | 30.0202 | 3379 | 0.8609          | 0.6364   |
| 0.8362        | 31.0202 | 3488 | 1.5082          | 0.4813   |
| 0.672         | 32.0202 | 3597 | 0.7459          | 0.7059   |
| 0.6874        | 33.0202 | 3706 | 0.9255          | 0.6845   |
| 0.6259        | 34.0202 | 3815 | 0.8475          | 0.6364   |
| 0.6356        | 35.0202 | 3924 | 0.8400          | 0.6791   |
| 0.6482        | 36.0202 | 4033 | 0.8579          | 0.6310   |
| 0.5495        | 37.0202 | 4142 | 1.5998          | 0.5241   |
| 0.6663        | 38.0202 | 4251 | 0.7969          | 0.7112   |
| 0.6363        | 39.0202 | 4360 | 1.1134          | 0.6845   |
| 0.6794        | 40.0202 | 4469 | 0.9227          | 0.6952   |
| 0.6632        | 41.0202 | 4578 | 1.1304          | 0.6631   |
| 0.7225        | 42.0202 | 4687 | 0.9182          | 0.7112   |
| 0.6032        | 43.0202 | 4796 | 1.2193          | 0.6578   |
| 0.5534        | 44.0202 | 4905 | 1.4561          | 0.6524   |
| 0.4216        | 45.0202 | 5014 | 1.3694          | 0.6364   |
| 0.6082        | 46.0202 | 5123 | 1.5731          | 0.5989   |
| 0.7025        | 47.0202 | 5232 | 1.8556          | 0.6257   |
| 0.4275        | 48.0202 | 5341 | 1.7699          | 0.6257   |
| 0.462         | 49.0109 | 5400 | 1.7417          | 0.6150   |

### Framework versions

- Transformers 4.41.2
- Pytorch 1.13.0+cu117
- Datasets 2.20.0
- Tokenizers 0.19.1