shevek committed
Commit 187e9db · verified · 1 Parent(s): 9d59ae0

End of training
README.md CHANGED
@@ -4,8 +4,6 @@ license: apache-2.0
  base_model: facebook/wav2vec2-base
  tags:
  - generated_from_trainer
- metrics:
- - accuracy
  model-index:
  - name: my_awesome_speach_model
    results: []
@@ -18,8 +16,13 @@ should probably proofread and complete it, then remove this comment. -->

  This model is a fine-tuned version of [facebook/wav2vec2-base](https://huggingface.co/facebook/wav2vec2-base) on the None dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.2361
- - Accuracy: 0.6610
+ - eval_loss: 0.9166
+ - eval_accuracy: 0.36
+ - eval_runtime: 0.2891
+ - eval_samples_per_second: 86.476
+ - eval_steps_per_second: 24.213
+ - epoch: 27.0
+ - step: 162

  ## Model description
@@ -39,118 +42,19 @@ More information needed

  The following hyperparameters were used during training:
  - learning_rate: 3e-05
- - train_batch_size: 32
- - eval_batch_size: 32
+ - train_batch_size: 4
+ - eval_batch_size: 4
  - seed: 42
  - gradient_accumulation_steps: 4
- - total_train_batch_size: 128
+ - total_train_batch_size: 16
  - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
  - lr_scheduler_type: linear
  - lr_scheduler_warmup_ratio: 0.1
  - num_epochs: 100

- ### Training results
-
- | Training Loss | Epoch | Step | Validation Loss | Accuracy |
- |:-------------:|:-------:|:----:|:---------------:|:--------:|
- | No log | 0.8 | 3 | 2.1052 | 0.0847 |
- | No log | 1.7333 | 6 | 2.0712 | 0.1610 |
- | No log | 2.6667 | 9 | 2.0195 | 0.6356 |
- | 2.0648 | 3.8667 | 13 | 1.9087 | 0.6864 |
- | 2.0648 | 4.8 | 16 | 1.8101 | 0.6864 |
- | 2.0648 | 5.7333 | 19 | 1.7046 | 0.6864 |
- | 1.8256 | 6.6667 | 22 | 1.5664 | 0.6864 |
- | 1.8256 | 7.8667 | 26 | 1.4010 | 0.6864 |
- | 1.8256 | 8.8 | 29 | 1.3322 | 0.6864 |
- | 1.4713 | 9.7333 | 32 | 1.2896 | 0.6864 |
- | 1.4713 | 10.6667 | 35 | 1.2617 | 0.6864 |
- | 1.4713 | 11.8667 | 39 | 1.2342 | 0.6864 |
- | 1.307 | 12.8 | 42 | 1.2206 | 0.6864 |
- | 1.307 | 13.7333 | 45 | 1.2094 | 0.6864 |
- | 1.307 | 14.6667 | 48 | 1.1998 | 0.6864 |
- | 1.2241 | 15.8667 | 52 | 1.1920 | 0.6864 |
- | 1.2241 | 16.8 | 55 | 1.1868 | 0.6864 |
- | 1.2241 | 17.7333 | 58 | 1.1850 | 0.6864 |
- | 1.2053 | 18.6667 | 61 | 1.2018 | 0.6864 |
- | 1.2053 | 19.8667 | 65 | 1.1801 | 0.6864 |
- | 1.2053 | 20.8 | 68 | 1.1851 | 0.6864 |
- | 1.1815 | 21.7333 | 71 | 1.1699 | 0.6864 |
- | 1.1815 | 22.6667 | 74 | 1.1746 | 0.6864 |
- | 1.1815 | 23.8667 | 78 | 1.2902 | 0.6864 |
- | 1.1471 | 24.8 | 81 | 1.1601 | 0.6864 |
- | 1.1471 | 25.7333 | 84 | 1.1527 | 0.6864 |
- | 1.1471 | 26.6667 | 87 | 1.1841 | 0.6864 |
- | 1.1109 | 27.8667 | 91 | 1.1406 | 0.6864 |
- | 1.1109 | 28.8 | 94 | 1.1454 | 0.6949 |
- | 1.1109 | 29.7333 | 97 | 1.2087 | 0.6525 |
- | 1.0994 | 30.6667 | 100 | 1.1712 | 0.6949 |
- | 1.0994 | 31.8667 | 104 | 1.1769 | 0.7034 |
- | 1.0994 | 32.8 | 107 | 1.1852 | 0.6949 |
- | 1.0516 | 33.7333 | 110 | 1.2119 | 0.6780 |
- | 1.0516 | 34.6667 | 113 | 1.1934 | 0.6949 |
- | 1.0516 | 35.8667 | 117 | 1.2235 | 0.6610 |
- | 1.0547 | 36.8 | 120 | 1.1929 | 0.6780 |
- | 1.0547 | 37.7333 | 123 | 1.1711 | 0.6780 |
- | 1.0547 | 38.6667 | 126 | 1.1893 | 0.6864 |
- | 0.9975 | 39.8667 | 130 | 1.1604 | 0.6864 |
- | 0.9975 | 40.8 | 133 | 1.1802 | 0.6864 |
- | 0.9975 | 41.7333 | 136 | 1.1613 | 0.6864 |
- | 0.9975 | 42.6667 | 139 | 1.1852 | 0.6780 |
- | 0.9829 | 43.8667 | 143 | 1.1511 | 0.7119 |
- | 0.9829 | 44.8 | 146 | 1.2872 | 0.6356 |
- | 0.9829 | 45.7333 | 149 | 1.1891 | 0.6864 |
- | 1.0212 | 46.6667 | 152 | 1.1853 | 0.6780 |
- | 1.0212 | 47.8667 | 156 | 1.3700 | 0.6017 |
- | 1.0212 | 48.8 | 159 | 1.2899 | 0.6271 |
- | 1.012 | 49.7333 | 162 | 1.2226 | 0.6695 |
- | 1.012 | 50.6667 | 165 | 1.2168 | 0.6695 |
- | 1.012 | 51.8667 | 169 | 1.2985 | 0.6356 |
- | 1.0166 | 52.8 | 172 | 1.2924 | 0.6441 |
- | 1.0166 | 53.7333 | 175 | 1.2145 | 0.6525 |
- | 1.0166 | 54.6667 | 178 | 1.2080 | 0.6695 |
- | 0.9709 | 55.8667 | 182 | 1.3386 | 0.6356 |
- | 0.9709 | 56.8 | 185 | 1.2637 | 0.6610 |
- | 0.9709 | 57.7333 | 188 | 1.1988 | 0.6949 |
- | 0.9882 | 58.6667 | 191 | 1.2233 | 0.6610 |
- | 0.9882 | 59.8667 | 195 | 1.3560 | 0.6441 |
- | 0.9882 | 60.8 | 198 | 1.3280 | 0.6441 |
- | 0.9324 | 61.7333 | 201 | 1.2938 | 0.6271 |
- | 0.9324 | 62.6667 | 204 | 1.2439 | 0.6610 |
- | 0.9324 | 63.8667 | 208 | 1.3100 | 0.6271 |
- | 0.9331 | 64.8 | 211 | 1.3142 | 0.6356 |
- | 0.9331 | 65.7333 | 214 | 1.2808 | 0.6525 |
- | 0.9331 | 66.6667 | 217 | 1.2599 | 0.6525 |
- | 0.9155 | 67.8667 | 221 | 1.2801 | 0.6525 |
- | 0.9155 | 68.8 | 224 | 1.2173 | 0.6864 |
- | 0.9155 | 69.7333 | 227 | 1.2677 | 0.6525 |
- | 0.88 | 70.6667 | 230 | 1.2324 | 0.6780 |
- | 0.88 | 71.8667 | 234 | 1.1966 | 0.6780 |
- | 0.88 | 72.8 | 237 | 1.2495 | 0.6695 |
- | 0.9119 | 73.7333 | 240 | 1.2212 | 0.6695 |
- | 0.9119 | 74.6667 | 243 | 1.2157 | 0.6695 |
- | 0.9119 | 75.8667 | 247 | 1.2324 | 0.6610 |
- | 0.8721 | 76.8 | 250 | 1.2343 | 0.6695 |
- | 0.8721 | 77.7333 | 253 | 1.2306 | 0.6610 |
- | 0.8721 | 78.6667 | 256 | 1.2322 | 0.6610 |
- | 0.8741 | 79.8667 | 260 | 1.2413 | 0.6695 |
- | 0.8741 | 80.8 | 263 | 1.2184 | 0.6949 |
- | 0.8741 | 81.7333 | 266 | 1.2102 | 0.6864 |
- | 0.8741 | 82.6667 | 269 | 1.2311 | 0.6780 |
- | 0.8509 | 83.8667 | 273 | 1.2596 | 0.6525 |
- | 0.8509 | 84.8 | 276 | 1.2589 | 0.6525 |
- | 0.8509 | 85.7333 | 279 | 1.2425 | 0.6695 |
- | 0.8614 | 86.6667 | 282 | 1.2361 | 0.6695 |
- | 0.8614 | 87.8667 | 286 | 1.2317 | 0.6695 |
- | 0.8614 | 88.8 | 289 | 1.2295 | 0.6695 |
- | 0.8762 | 89.7333 | 292 | 1.2310 | 0.6695 |
- | 0.8762 | 90.6667 | 295 | 1.2352 | 0.6695 |
- | 0.8762 | 91.8667 | 299 | 1.2362 | 0.6610 |
- | 0.8594 | 92.2667 | 300 | 1.2361 | 0.6610 |
-
-
  ### Framework versions

- - Transformers 4.46.2
+ - Transformers 4.46.3
  - Pytorch 2.5.1+cu121
- - Datasets 3.1.0
+ - Datasets 3.2.0
  - Tokenizers 0.20.3
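The numbers in the card are internally consistent, and it can be worth checking that when editing one. The total_train_batch_size lines follow from train_batch_size × gradient_accumulation_steps (assuming single-device training, which the card does not state), and the eval throughput figures imply the evaluation-set size. A quick sketch, with all numeric values copied from the diff above and the helper function purely illustrative:

```python
# Sanity-check the hyperparameter arithmetic reported in the model card diff.

def total_train_batch_size(per_device_batch, grad_accum_steps, num_devices=1):
    """Effective examples consumed per optimizer step.

    Assumes single-device training; with multiple devices the total
    scales by num_devices as well.
    """
    return per_device_batch * grad_accum_steps * num_devices

# Old card: 32 * 4 = 128; new card: 4 * 4 = 16 -- both match the listed totals.
assert total_train_batch_size(32, 4) == 128
assert total_train_batch_size(4, 4) == 16

# The eval throughput figures recover the evaluation-set size:
eval_samples = round(0.2891 * 86.476)  # eval_runtime * eval_samples_per_second
eval_steps = round(0.2891 * 24.213)    # eval_runtime * eval_steps_per_second
print(eval_samples, eval_steps)        # 25 samples in 7 batches
```

Note that 7 batches of eval_batch_size 4 covers 25 samples (the last batch is partial), so the reported runtime, samples/s, and steps/s agree with each other.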
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:7ab2a12c47d5a976384ae4a5bb479ba2946143c2f58f6f2ac6642a7522e5d8e9
+ oid sha256:2842c4dcdff55d4108aa78f38c132afb493eca2c07418872a59dac5ae77c3aca
  size 378302360
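The weight file is stored as a Git LFS pointer rather than as raw bytes, which is why the diff is only three lines: the sha256 oid changed while the size stayed at 378302360 bytes, as expected when the same architecture is retrained. A minimal sketch of reading such a pointer (the pointer text is copied from the diff; the parser is illustrative, not part of any repository tooling):

```python
def parse_lfs_pointer(text):
    """Parse a Git LFS pointer file into a dict of its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

pointer = """\
version https://git-lfs.github.com/spec/v1
oid sha256:2842c4dcdff55d4108aa78f38c132afb493eca2c07418872a59dac5ae77c3aca
size 378302360
"""

info = parse_lfs_pointer(pointer)
print(info["oid"])   # the sha256 of the actual weight blob
print(info["size"])  # size in bytes, as a string
```

Comparing `oid` values is enough to tell whether two revisions ship different weights, without downloading either blob.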
runs/Dec15_06-36-53_4a551e90a1ca/events.out.tfevents.1734244613.4a551e90a1ca.2138.1 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:2b470bb11e926e84f178101b4fdf2d45c01e7ccf1149b959a5029d8ca340e7cb
- size 17439
+ oid sha256:728a69a83948858fbedcd22a47ff9528d501a11be087b1235f6067f45df963bc
+ size 18830
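The card's schedule settings (lr_scheduler_type: linear with lr_scheduler_warmup_ratio: 0.1) can also be sketched numerically. The 300-step total below matches the final row of the removed training-results table from the previous run; for the new run the total step count is not stated, so treat it as an assumption:

```python
def linear_lr_with_warmup(step, total_steps, base_lr=3e-05, warmup_ratio=0.1):
    """Linear warmup to base_lr over warmup_ratio of training, then linear
    decay to zero -- a sketch of the schedule named in the card, not the
    exact Transformers implementation."""
    warmup_steps = int(total_steps * warmup_ratio)
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# Peak LR is reached at the end of warmup (step 30 of 300), then decays to zero.
print(linear_lr_with_warmup(30, 300))   # 3e-05
print(linear_lr_with_warmup(300, 300))  # 0.0
```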