---
library_name: transformers
license: cc-by-nc-4.0
base_model: facebook/timesformer-base-finetuned-k400
tags:
- generated_from_trainer
metrics:
- accuracy
- precision
- recall
- f1
model-index:
- name: Timesformer_WLASL_100_200_epochs_p20_SR_16
  results: []
---

# Timesformer_WLASL_100_200_epochs_p20_SR_16

This model is a fine-tuned version of [facebook/timesformer-base-finetuned-k400](https://huggingface.co/facebook/timesformer-base-finetuned-k400) on the WLASL-100 dataset (word-level American Sign Language, 100 glosses), as indicated by the model name.
It achieves the following results on the evaluation set:
- Loss: 2.2599
- Top 1 Accuracy: 0.5828
- Top 5 Accuracy: 0.7899
- Top 10 Accuracy: 0.8698
- Accuracy: 0.5828
- Precision: 0.5806
- Recall: 0.5828
- F1: 0.5510
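
The Top-k accuracies treat a prediction as correct when the true gloss is among the model's k highest-scoring classes, so Top-1 accuracy coincides with the plain accuracy above. As a minimal sketch (the exact evaluation code behind this card is not included here), such metrics can be computed from logits like this:

```python
import numpy as np

def top_k_accuracy(logits: np.ndarray, labels: np.ndarray, k: int) -> float:
    """Fraction of samples whose true label is among the k largest logits.

    logits: (num_samples, num_classes); labels: (num_samples,).
    """
    top_k = np.argsort(logits, axis=-1)[:, -k:]  # indices of the k best classes
    hits = (top_k == labels[:, None]).any(axis=-1)
    return float(hits.mean())
```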

## Model description

TimesFormer is a video transformer that applies divided space-time attention over a sequence of sampled frames. This checkpoint starts from `facebook/timesformer-base-finetuned-k400` (pretrained on Kinetics-400) and replaces the classification head with a 100-way classifier for isolated sign recognition. Judging by the model name, it targets the 100-gloss WLASL-100 subset and uses 16 frames per clip (`SR_16`); the meaning of the `p20` suffix is not documented.
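
A minimal inference sketch with 🤗 Transformers follows. The hub namespace (`<user>/`) is a placeholder, the 16-frame clip length is inferred from the model name, and the random `video` stands in for 16 decoded RGB frames of a signing clip:

```python
import numpy as np
import torch
from transformers import AutoImageProcessor, TimesformerForVideoClassification

# Placeholder repository id; substitute the actual hub namespace.
model_id = "<user>/Timesformer_WLASL_100_200_epochs_p20_SR_16"
processor = AutoImageProcessor.from_pretrained(model_id)
model = TimesformerForVideoClassification.from_pretrained(model_id)

# 16 RGB frames, each (H, W, 3) uint8 - replace with frames from a real clip.
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Report the five highest-scoring glosses.
top5 = logits.topk(5, dim=-1).indices[0].tolist()
print([model.config.id2label[i] for i in top5])
```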

## Intended uses & limitations

The model is intended for classifying short, isolated sign-language video clips into one of 100 glosses; it is not designed for continuous signing or sign-to-text translation. Note the `cc-by-nc-4.0` license, which rules out commercial use. Further details on limitations and evaluation conditions are not documented.

## Training and evaluation data

The model name points to the WLASL-100 subset of the WLASL (Word-Level American Sign Language) dataset, but the exact splits and preprocessing used for the results above are not documented. A plausible frame-sampling step is sketched below.
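
A hedged sketch of uniform 16-frame sampling (the `SR_16` suffix suggests 16 frames per clip, though it could also denote a sampling stride), using the `decord` video reader:

```python
import numpy as np
from decord import VideoReader

def sample_frames(path: str, num_frames: int = 16) -> list[np.ndarray]:
    """Uniformly sample `num_frames` RGB frames across the whole clip."""
    vr = VideoReader(path)
    indices = np.linspace(0, len(vr) - 1, num=num_frames).astype(int)
    return [vr[int(i)].asnumpy() for i in indices]
```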

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 8
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 36000
- mixed_precision_training: Native AMP
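
Roughly, these settings correspond to the following 🤗 `TrainingArguments`; this is a reconstruction from the list above, not the original training script:

```python
from transformers import TrainingArguments

# Reconstructed from the hyperparameters listed above (output_dir assumed).
training_args = TrainingArguments(
    output_dir="Timesformer_WLASL_100_200_epochs_p20_SR_16",
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    gradient_accumulation_steps=4,  # effective train batch size: 8
    seed=42,
    optim="adamw_torch",
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=36000,
    fp16=True,  # "Native AMP" mixed precision
)
```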

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Top 1 Accuracy | Top 5 Accuracy | Top 10 Accuracy | Accuracy | Precision | Recall | F1     |
|:-------------:|:-------:|:----:|:---------------:|:--------------:|:--------------:|:---------------:|:--------:|:---------:|:------:|:------:|
| 19.1155       | 0.005   | 180  | 4.6927          | 0.0089         | 0.0414         | 0.0888          | 0.0089   | 0.0155    | 0.0089 | 0.0105 |
| 18.5538       | 1.0050  | 360  | 4.5821          | 0.0266         | 0.0769         | 0.1302          | 0.0266   | 0.0137    | 0.0266 | 0.0116 |
| 17.5848       | 2.0050  | 540  | 4.3988          | 0.0562         | 0.1450         | 0.2633          | 0.0562   | 0.0486    | 0.0562 | 0.0390 |
| 15.8283       | 3.0050  | 721  | 4.0516          | 0.1302         | 0.2959         | 0.4645          | 0.1302   | 0.1012    | 0.1302 | 0.0976 |
| 13.3102       | 4.005   | 901  | 3.6150          | 0.2249         | 0.4704         | 0.6154          | 0.2249   | 0.1781    | 0.2249 | 0.1741 |
| 11.2113       | 5.0050  | 1081 | 3.2389          | 0.2604         | 0.6065         | 0.7367          | 0.2604   | 0.2422    | 0.2604 | 0.2215 |
| 8.898         | 6.0050  | 1261 | 2.8714          | 0.3757         | 0.6775         | 0.8166          | 0.3757   | 0.3584    | 0.3757 | 0.3324 |
| 6.715         | 7.0050  | 1442 | 2.6518          | 0.4231         | 0.7249         | 0.8402          | 0.4231   | 0.3828    | 0.4231 | 0.3730 |
| 4.8442        | 8.005   | 1622 | 2.3294          | 0.4645         | 0.7929         | 0.8876          | 0.4645   | 0.5077    | 0.4645 | 0.4377 |
| 3.3825        | 9.0050  | 1802 | 2.1747          | 0.4911         | 0.7899         | 0.8964          | 0.4911   | 0.5436    | 0.4911 | 0.4654 |
| 2.0471        | 10.0050 | 1982 | 1.9990          | 0.5148         | 0.8107         | 0.9053          | 0.5178   | 0.5871    | 0.5178 | 0.5057 |
| 1.3242        | 11.0050 | 2163 | 1.8964          | 0.5473         | 0.8166         | 0.8935          | 0.5473   | 0.5822    | 0.5473 | 0.5199 |
| 0.8746        | 12.005  | 2343 | 1.8222          | 0.5562         | 0.8254         | 0.9083          | 0.5562   | 0.5796    | 0.5562 | 0.5320 |
| 0.5537        | 13.0050 | 2523 | 1.7525          | 0.5769         | 0.8343         | 0.9142          | 0.5769   | 0.5813    | 0.5769 | 0.5468 |
| 0.4081        | 14.0050 | 2703 | 1.7351          | 0.5947         | 0.8136         | 0.8964          | 0.5947   | 0.6684    | 0.5947 | 0.5834 |
| 0.17          | 15.0050 | 2884 | 1.6998          | 0.5592         | 0.8225         | 0.9083          | 0.5592   | 0.5763    | 0.5592 | 0.5342 |
| 0.2053        | 16.005  | 3064 | 1.7340          | 0.5651         | 0.8343         | 0.9083          | 0.5651   | 0.6215    | 0.5651 | 0.5390 |
| 0.1434        | 17.0050 | 3244 | 1.7350          | 0.6006         | 0.8432         | 0.9142          | 0.6006   | 0.6347    | 0.6006 | 0.5806 |
| 0.1957        | 18.0050 | 3424 | 1.8179          | 0.5621         | 0.8373         | 0.9142          | 0.5621   | 0.6060    | 0.5621 | 0.5350 |
| 0.1636        | 19.0050 | 3605 | 1.7831          | 0.6154         | 0.8225         | 0.8905          | 0.6154   | 0.6401    | 0.6154 | 0.5917 |
| 0.0908        | 20.005  | 3785 | 1.7552          | 0.6213         | 0.8402         | 0.9053          | 0.6213   | 0.6504    | 0.6213 | 0.6014 |
| 0.058         | 21.0050 | 3965 | 1.8422          | 0.6243         | 0.8254         | 0.9112          | 0.6213   | 0.6392    | 0.6213 | 0.5962 |
| 0.0924        | 22.0050 | 4145 | 1.8347          | 0.6006         | 0.8225         | 0.9201          | 0.6006   | 0.6218    | 0.6006 | 0.5735 |
| 0.0799        | 23.0050 | 4326 | 1.9650          | 0.6036         | 0.8107         | 0.8846          | 0.6036   | 0.6182    | 0.6036 | 0.5724 |
| 0.176         | 24.005  | 4506 | 1.9326          | 0.5858         | 0.8402         | 0.9142          | 0.5858   | 0.6240    | 0.5858 | 0.5671 |
| 0.0786        | 25.0050 | 4686 | 1.7753          | 0.6124         | 0.8491         | 0.9142          | 0.6124   | 0.6607    | 0.6124 | 0.5998 |
| 0.242         | 26.0050 | 4866 | 2.0219          | 0.5769         | 0.7722         | 0.8876          | 0.5769   | 0.6337    | 0.5769 | 0.5552 |
| 0.1767        | 27.0050 | 5047 | 1.9744          | 0.5828         | 0.8166         | 0.9024          | 0.5828   | 0.6330    | 0.5828 | 0.5721 |
| 0.14          | 28.005  | 5227 | 2.1996          | 0.5769         | 0.7811         | 0.8609          | 0.5769   | 0.5983    | 0.5769 | 0.5430 |
| 0.104         | 29.0050 | 5407 | 2.0881          | 0.5769         | 0.8166         | 0.8876          | 0.5769   | 0.6146    | 0.5769 | 0.5641 |
| 0.1454        | 30.0050 | 5587 | 2.3394          | 0.5621         | 0.7959         | 0.8905          | 0.5621   | 0.6280    | 0.5621 | 0.5448 |
| 0.2221        | 31.0050 | 5768 | 1.9360          | 0.5947         | 0.8225         | 0.9024          | 0.5947   | 0.6606    | 0.5947 | 0.5881 |
| 0.1026        | 32.005  | 5948 | 2.0920          | 0.6036         | 0.8107         | 0.8935          | 0.6036   | 0.6376    | 0.6036 | 0.5832 |
| 0.0968        | 33.0050 | 6128 | 2.2746          | 0.5740         | 0.8047         | 0.8846          | 0.5740   | 0.6308    | 0.5740 | 0.5542 |
| 0.1864        | 34.0050 | 6308 | 2.2081          | 0.5888         | 0.8047         | 0.8698          | 0.5888   | 0.6394    | 0.5888 | 0.5704 |
| 0.1353        | 35.0050 | 6489 | 2.1853          | 0.5799         | 0.8254         | 0.8935          | 0.5799   | 0.6133    | 0.5799 | 0.5636 |
| 0.1618        | 36.005  | 6669 | 2.2661          | 0.5710         | 0.7959         | 0.8698          | 0.5710   | 0.6243    | 0.5710 | 0.5515 |
| 0.259         | 37.0050 | 6849 | 2.3163          | 0.5740         | 0.7870         | 0.8580          | 0.5740   | 0.6088    | 0.5740 | 0.5459 |
| 0.3394        | 38.0050 | 7029 | 2.0984          | 0.5769         | 0.7988         | 0.8905          | 0.5769   | 0.6154    | 0.5769 | 0.5614 |
| 0.0833        | 39.0050 | 7210 | 2.2811          | 0.5533         | 0.8047         | 0.8698          | 0.5533   | 0.6051    | 0.5533 | 0.5328 |
| 0.1259        | 40.005  | 7390 | 2.2599          | 0.5828         | 0.7899         | 0.8698          | 0.5828   | 0.5806    | 0.5828 | 0.5510 |


### Framework versions

- Transformers 4.46.1
- PyTorch 2.5.1+cu124
- Datasets 3.1.0
- Tokenizers 0.20.1