fine-w2v2base-bs16-ep200-lr2e-05-linguistic-rmsnorm-focal_ctc_a0.99_g2-0.05_10_0.004_40-final

This model is a fine-tuned version of nguyenvulebinh/wav2vec2-base-vietnamese-250h on an unknown dataset. It achieves the following results on the evaluation set (a usage sketch follows):

  • Loss: 4.2098
  • WER: 0.1111
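
The card does not state a pipeline type, but the base checkpoint is a wav2vec 2.0 CTC model for Vietnamese ASR, so transcription should go through the standard Wav2Vec2ForCTC path. Below is a minimal usage sketch, assuming the repository id shown on this page, stock Transformers classes (any custom "linguistic-rmsnorm" layers may require the author's own code), and a placeholder 16 kHz audio file:

```python
import torch
import torchaudio
from transformers import Wav2Vec2ForCTC, Wav2Vec2Processor

# Repo id taken from this card; custom architecture pieces, if any,
# may not load with the stock classes used here.
MODEL_ID = "tuanio/fine-w2v2base-bs16-ep200-lr2e-05-linguistic-rmsnorm-focal_ctc_a0.99_g2-0.05_10_0.004_40-final"

processor = Wav2Vec2Processor.from_pretrained(MODEL_ID)
model = Wav2Vec2ForCTC.from_pretrained(MODEL_ID).eval()

# Load audio and resample to the 16 kHz rate wav2vec 2.0 expects.
waveform, sr = torchaudio.load("sample.wav")  # placeholder path
if sr != 16_000:
    waveform = torchaudio.functional.resample(waveform, sr, 16_000)

inputs = processor(waveform.squeeze(0).numpy(), sampling_rate=16_000,
                   return_tensors="pt")
with torch.no_grad():
    logits = model(inputs.input_values).logits

# Greedy CTC decoding: argmax per frame, then collapse repeats and blanks.
pred_ids = torch.argmax(logits, dim=-1)
print(processor.batch_decode(pred_ids)[0])
```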

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure
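
The run name encodes the training objective: a focal CTC loss with alpha = 0.99 and gamma = 2 (the trailing "0.05_10_0.004_40" values are not explained by the card and are left as-is). The exact formulation is not documented here; the sketch below shows one common focal-CTC variant, assuming each utterance's CTC loss is reweighted by alpha * (1 - exp(-CTC))**gamma:

```python
import torch
import torch.nn.functional as F

def focal_ctc_loss(log_probs, targets, input_lengths, target_lengths,
                   alpha=0.99, gamma=2.0, blank=0):
    """One common focal-CTC formulation (an assumption -- it may differ
    from the loss actually used for this run): down-weight utterances the
    model already fits well, so hard utterances dominate the gradient.

    log_probs: (T, N, C) log-softmax outputs, as F.ctc_loss expects.
    """
    ctc = F.ctc_loss(log_probs, targets, input_lengths, target_lengths,
                     blank=blank, reduction="none", zero_infinity=True)
    p = torch.exp(-ctc)  # per-utterance "probability" proxy
    return (alpha * (1.0 - p) ** gamma * ctc).mean()
```

With gamma = 2 and alpha near 1, this behaves like plain CTC early in training (when p is near 0) and progressively discounts well-fit utterances later.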

Training hyperparameters

The following hyperparameters were used during training (an equivalent TrainingArguments sketch follows the list):

  • learning_rate: 2e-05
  • train_batch_size: 16
  • eval_batch_size: 8
  • seed: 42
  • distributed_type: multi-GPU
  • num_devices: 4
  • total_train_batch_size: 64
  • total_eval_batch_size: 32
  • optimizer: Adam with betas=(0.9,0.98) and epsilon=1e-08
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 200
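
These settings map directly onto Hugging Face TrainingArguments. A minimal sketch of the equivalent configuration (output_dir and the eval/logging cadence are assumptions; the per-device sizes times 4 GPUs give the totals listed above):

```python
from transformers import TrainingArguments

args = TrainingArguments(
    output_dir="w2v2base-vi-focal-ctc",  # placeholder name
    learning_rate=2e-5,
    per_device_train_batch_size=16,      # x 4 GPUs -> total 64
    per_device_eval_batch_size=8,        # x 4 GPUs -> total 32
    seed=42,
    num_train_epochs=200,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    adam_beta1=0.9,
    adam_beta2=0.98,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",         # the results table evaluates every 50 steps
    eval_steps=50,
    logging_steps=50,
)
```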

Training results

Training Loss Epoch Step Validation Loss WER
2070.5209 0.94 50 1050.6107 15.9194
1893.1078 1.89 100 856.5042 15.9084
1402.1547 2.83 150 638.6856 15.9718
1145.3789 3.77 200 541.5690 15.9801
921.5391 4.72 250 388.8102 15.9792
272.1148 5.66 300 86.7826 1.0
111.1403 6.6 350 82.6906 1.0
104.6625 7.55 400 80.7376 1.0
99.9559 8.49 450 79.5480 1.0
99.3013 9.43 500 78.0927 1.0
97.293 10.38 550 76.9956 1.0
98.062 11.32 600 76.4573 1.0
96.0945 12.26 650 75.6026 1.0
95.9684 13.21 700 75.6452 1.0
94.6767 14.15 750 75.9780 1.0
93.8767 15.09 800 77.8212 0.9992
91.9104 16.04 850 78.9036 1.0012
89.6319 16.98 900 74.7778 0.9991
87.5197 17.92 950 73.3647 0.9993
86.9794 18.87 1000 71.1035 0.9998
84.6621 19.81 1050 67.8181 0.9998
81.1323 20.75 1100 53.9551 0.8985
56.2753 21.7 1150 25.1096 0.3755
32.2015 22.64 1200 15.0509 0.2327
22.3559 23.58 1250 11.2544 0.1796
17.766 24.53 1300 9.1760 0.1561
15.3377 25.47 1350 7.9738 0.1502
13.0247 26.42 1400 7.1329 0.1391
12.047 27.36 1450 6.4816 0.1260
10.874 28.3 1500 6.0990 0.1260
10.3489 29.25 1550 6.0334 0.1276
9.5992 30.19 1600 5.6333 0.1204
8.7578 31.13 1650 5.4704 0.1115
8.8291 32.08 1700 5.2070 0.1063
8.1346 33.02 1750 5.1131 0.1092
7.7698 33.96 1800 4.9853 0.1059
7.2385 34.91 1850 4.9884 0.1092
7.2942 35.85 1900 4.9169 0.1004
7.1231 36.79 1950 4.7677 0.1009
6.6689 37.74 2000 4.8707 0.1078
6.6686 38.68 2050 4.6952 0.1023
6.3965 39.62 2100 4.9130 0.1065
6.2281 40.57 2150 4.6463 0.0982
5.8648 41.51 2200 4.8060 0.1083
5.8669 42.45 2250 4.7226 0.1088
5.4889 43.4 2300 4.6982 0.1104
5.5636 44.34 2350 4.6289 0.1089
5.512 45.28 2400 4.4615 0.1035
5.1006 46.23 2450 4.4759 0.0973
5.04 47.17 2500 4.4644 0.1072
4.7533 48.11 2550 4.3047 0.1011
4.811 49.06 2600 4.3995 0.0978
4.4865 50.0 2650 4.2904 0.0945
4.41 50.94 2700 4.2735 0.0919
4.6938 51.89 2750 4.2735 0.0929
4.2775 52.83 2800 4.3565 0.0944
4.4868 53.77 2850 4.3067 0.0936
4.3502 54.72 2900 4.3263 0.1015
3.9422 55.66 2950 4.1456 0.0986
3.83 56.6 3000 4.1247 0.0994
4.0432 57.55 3050 4.1449 0.0943
3.9007 58.49 3100 4.2760 0.1001
3.7194 59.43 3150 4.1489 0.0938
3.791 60.38 3200 4.1865 0.0952
3.439 61.32 3250 4.0903 0.0978
3.666 62.26 3300 4.1479 0.1019
3.3243 63.21 3350 4.0614 0.1013
3.389 64.15 3400 4.0781 0.0987
3.3151 65.09 3450 4.2045 0.1063
3.6432 66.04 3500 4.2502 0.1057
3.3547 66.98 3550 4.0707 0.0946
3.323 67.92 3600 4.1075 0.0962
3.1881 68.87 3650 4.1951 0.0992
3.2008 69.81 3700 4.1416 0.0945
3.079 70.75 3750 4.1982 0.0923
3.0741 71.7 3800 4.2177 0.0985
2.9199 72.64 3850 4.2224 0.0969
2.9009 73.58 3900 4.1863 0.0956
2.6505 74.53 3950 4.1560 0.0987
2.9569 75.47 4000 4.1147 0.0888
2.7948 76.42 4050 4.2427 0.1057
2.9366 77.36 4100 4.3038 0.1091
2.9399 78.3 4150 4.2281 0.1020
2.5798 79.25 4200 4.2448 0.0980
2.715 80.19 4250 4.1647 0.0931
2.615 81.13 4300 4.1305 0.0952
2.6131 82.08 4350 4.2630 0.0984
2.5931 83.02 4400 4.1665 0.1034
2.4909 83.96 4450 4.1648 0.0947
2.5452 84.91 4500 4.1319 0.1029
2.3713 85.85 4550 4.0906 0.1014
2.452 86.79 4600 4.0809 0.0968
2.3391 87.74 4650 4.1726 0.0990
2.3136 88.68 4700 4.1336 0.0933
2.2644 89.62 4750 4.1530 0.1041
2.0899 90.57 4800 4.2035 0.1102
2.4311 91.51 4850 4.1507 0.0989
1.9583 92.45 4900 4.2440 0.0996
2.4467 93.4 4950 4.1794 0.1077
2.1111 94.34 5000 4.1224 0.0926
2.0238 95.28 5050 4.1248 0.0948
2.1593 96.23 5100 4.2034 0.1085
2.033 97.17 5150 4.1157 0.1119
2.0795 98.11 5200 4.1638 0.1004
2.0027 99.06 5250 4.1367 0.1029
2.0702 100.0 5300 4.1131 0.0993
2.0022 100.94 5350 4.0984 0.1034
2.0313 101.89 5400 4.1044 0.0979
2.0468 102.83 5450 4.1019 0.0982
1.9196 103.77 5500 4.1935 0.1070
1.8988 104.72 5550 4.1279 0.1032
1.9784 105.66 5600 4.1553 0.1068
2.0349 106.6 5650 4.1259 0.1060
1.6378 107.55 5700 4.1543 0.1056
1.7948 108.49 5750 4.1599 0.1122
1.8042 109.43 5800 4.1429 0.1113
1.7872 110.38 5850 4.1495 0.1032
1.8428 111.32 5900 4.1143 0.1151
1.8995 112.26 5950 4.1219 0.1019
1.7064 113.21 6000 4.1017 0.1115
1.5617 114.15 6050 4.0737 0.1088
1.7554 115.09 6100 4.1050 0.1048
1.7072 116.04 6150 4.1199 0.1077
1.6821 116.98 6200 4.1431 0.1037
1.6876 117.92 6250 4.1442 0.1074
1.6461 118.87 6300 4.1750 0.1019
1.5313 119.81 6350 4.1441 0.1092
1.7041 120.75 6400 4.1632 0.1087
1.6251 121.7 6450 4.1980 0.1094
1.6317 122.64 6500 4.1192 0.1034
1.5896 123.58 6550 4.1356 0.1121
1.5714 124.53 6600 4.1736 0.1090
1.3745 125.47 6650 4.2218 0.1094
1.7257 126.42 6700 4.2172 0.1138
1.524 127.36 6750 4.1964 0.1099
1.4954 128.3 6800 4.2411 0.1101
1.5402 129.25 6850 4.1481 0.1079
1.5668 130.19 6900 4.1864 0.1081
1.5251 131.13 6950 4.1792 0.1161
1.6132 132.08 7000 4.1093 0.1094
1.6573 133.02 7050 4.1153 0.1122
1.5327 133.96 7100 4.1231 0.1129
1.5617 134.91 7150 4.1707 0.1200
1.5798 135.85 7200 4.1301 0.1141
1.5294 136.79 7250 4.1376 0.1149
1.4742 137.74 7300 4.1316 0.1149
1.569 138.68 7350 4.1947 0.1154
1.5434 139.62 7400 4.1617 0.1130
1.4833 140.57 7450 4.1586 0.1187
1.3112 141.51 7500 4.1543 0.1125
1.4757 142.45 7550 4.1885 0.1127
1.4602 143.4 7600 4.1938 0.1185
1.3891 144.34 7650 4.2258 0.1134
1.5484 145.28 7700 4.2443 0.1130
1.3533 146.23 7750 4.2355 0.1064
1.3938 147.17 7800 4.2510 0.1087
1.422 148.11 7850 4.2208 0.1174
1.2897 149.06 7900 4.2606 0.1180
1.4107 150.0 7950 4.2759 0.1113
1.3735 150.94 8000 4.2398 0.1098
1.4142 151.89 8050 4.2370 0.1080
1.3136 152.83 8100 4.2353 0.1061
1.4554 153.77 8150 4.2255 0.1090
1.4135 154.72 8200 4.2362 0.1107
1.3512 155.66 8250 4.2431 0.1099
1.3081 156.6 8300 4.2480 0.1097
1.2292 157.55 8350 4.2302 0.1101
1.3 158.49 8400 4.2558 0.1124
1.368 159.43 8450 4.2727 0.1082
1.3324 160.38 8500 4.2577 0.1121
1.293 161.32 8550 4.2435 0.1153
1.2726 162.26 8600 4.2194 0.1146
1.3561 163.21 8650 4.2485 0.1170
1.2194 164.15 8700 4.2325 0.1115
1.3088 165.09 8750 4.2530 0.1121
1.3285 166.04 8800 4.2556 0.1116
1.2224 166.98 8850 4.2561 0.1098
1.3535 167.92 8900 4.2463 0.1108
1.2354 168.87 8950 4.2457 0.1073
1.2799 169.81 9000 4.2256 0.1098
1.2153 170.75 9050 4.2130 0.1088
1.1879 171.7 9100 4.1974 0.1087
1.2708 172.64 9150 4.2232 0.1133
1.3335 173.58 9200 4.2444 0.1118
1.3543 174.53 9250 4.2460 0.1142
1.3021 175.47 9300 4.2073 0.1104
1.2694 176.42 9350 4.2009 0.1106
1.3015 177.36 9400 4.2318 0.1126
1.2935 178.3 9450 4.2460 0.1142
1.2766 179.25 9500 4.2334 0.1134
1.1748 180.19 9550 4.2197 0.1119
1.2498 181.13 9600 4.2149 0.1107
1.2658 182.08 9650 4.2115 0.1126
1.3142 183.02 9700 4.2067 0.1107
1.2422 183.96 9750 4.2044 0.1123
1.2152 184.91 9800 4.2051 0.1130
1.2157 185.85 9850 4.2080 0.1132
1.1727 186.79 9900 4.2041 0.1104
1.2594 187.74 9950 4.2049 0.1115
1.3206 188.68 10000 4.2014 0.1115
1.1332 189.62 10050 4.2047 0.1114
1.2477 190.57 10100 4.2078 0.1115
1.2712 191.51 10150 4.2069 0.1117
1.1063 192.45 10200 4.2073 0.1119
1.3181 193.4 10250 4.2094 0.1109
1.1348 194.34 10300 4.2090 0.1114
1.224 195.28 10350 4.2065 0.1114
1.242 196.23 10400 4.2089 0.1112
1.1683 197.17 10450 4.2100 0.1113
1.2693 198.11 10500 4.2081 0.1109
1.3093 199.06 10550 4.2092 0.1109
1.229 200.0 10600 4.2098 0.1111
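
The WER column is the word error rate at each evaluation step; as a ratio of word-level edits to reference words it can exceed 1.0 when the model inserts many spurious words, which explains the values near 16 in the earliest rows. It can be reproduced with the evaluate library (a tooling assumption; the card only lists the framework versions below):

```python
import evaluate

# Toy strings for illustration; a real evaluation would decode the
# model's predictions on the held-out set behind this card.
wer_metric = evaluate.load("wer")
predictions = ["xin chào các bạn"]
references = ["xin chào tất cả các bạn"]
# 2 word deletions against 6 reference words -> WER ~= 0.33
print(wer_metric.compute(predictions=predictions, references=references))
```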

Framework versions

  • Transformers 4.34.0
  • Pytorch 2.0.1
  • Datasets 2.14.5
  • Tokenizers 0.14.1