agentlans committed
Commit 7854217
1 Parent(s): 44ee773

Upload 8 files

README.md CHANGED
@@ -1,26 +1,22 @@
 ---
 library_name: transformers
 base_model: agentlans/multilingual-e5-small-aligned
-language:
-- multilingual
 tags:
 - generated_from_trainer
 model-index:
-- name: multilingual-e5-small-aligned-transformed-readability
+- name: multilingual-e5-small-aligned-readability-20241214-new
   results: []
-datasets:
-- agentlans/en-translations
 ---
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment. -->
 
-# multilingual-e5-small-aligned-transformed-readability
+# multilingual-e5-small-aligned-readability-20241214-new
 
 This model is a fine-tuned version of [agentlans/multilingual-e5-small-aligned](https://huggingface.co/agentlans/multilingual-e5-small-aligned) on an unknown dataset.
 It achieves the following results on the evaluation set:
-- Loss: 0.1989
-- Mse: 0.1989
+- Loss: 0.1234
+- Mse: 0.1234
 
 ## Model description
 
@@ -40,7 +36,7 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 5e-05
-- train_batch_size: 32
+- train_batch_size: 128
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
@@ -51,9 +47,9 @@ The following hyperparameters were used during training:
 
 | Training Loss | Epoch | Step  | Validation Loss | Mse    |
 |:-------------:|:-----:|:-----:|:---------------:|:------:|
-| 0.2104        | 1.0   | 27096 | 0.2061          | 0.2061 |
-| 0.1718        | 2.0   | 54192 | 0.2066          | 0.2066 |
-| 0.141         | 3.0   | 81288 | 0.1989          | 0.1989 |
+| 0.1484        | 1.0   | 7813  | 0.1324          | 0.1324 |
+| 0.1157        | 2.0   | 15626 | 0.1241          | 0.1241 |
+| 0.096         | 3.0   | 23439 | 0.1234          | 0.1234 |
 
 
 ### Framework versions
@@ -61,4 +57,4 @@ The following hyperparameters were used during training:
 - Transformers 4.46.3
 - Pytorch 2.5.1+cu124
 - Datasets 3.1.0
-- Tokenizers 0.20.3
+- Tokenizers 0.20.3
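
The card in this commit carries no usage code. For readers who want to try the model the diff describes, here is a minimal inference sketch. It assumes the checkpoint is published under the model-index name shown above (`agentlans/multilingual-e5-small-aligned-readability-20241214-new`, a hypothetical repo id inferred from the card) and that the head is a single-output regression (`num_labels=1`), which the MSE metric suggests but the diff does not confirm.

```python
# Minimal inference sketch; the repo id and the regression-head layout are
# assumptions noted above, not facts stated in this commit.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "agentlans/multilingual-e5-small-aligned-readability-20241214-new"  # hypothetical
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

texts = [
    "The cat sat on the mat.",
    "Notwithstanding the aforementioned stipulations, the undersigned parties hereby covenant as follows.",
]
inputs = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    scores = model(**inputs).logits.squeeze(-1)  # one readability score per input text

for text, score in zip(texts, scores.tolist()):
    print(f"{score:+.3f}  {text}")
```
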
all_results.json CHANGED
@@ -1,15 +1,15 @@
 {
     "epoch": 3.0,
-    "eval_loss": 0.19889499247074127,
-    "eval_mse": 0.19889500241300032,
-    "eval_runtime": 57.9433,
-    "eval_samples": 96338,
-    "eval_samples_per_second": 1662.626,
-    "eval_steps_per_second": 207.841,
-    "total_flos": 4.283504864539085e+16,
-    "train_loss": 0.17857247165732115,
-    "train_runtime": 4471.1905,
-    "train_samples": 867042,
-    "train_samples_per_second": 581.752,
-    "train_steps_per_second": 18.18
+    "eval_loss": 0.12343376874923706,
+    "eval_mse": 0.12343375904401642,
+    "eval_runtime": 97.8103,
+    "eval_samples": 182111,
+    "eval_samples_per_second": 1861.88,
+    "eval_steps_per_second": 232.736,
+    "total_flos": 4.9403660544e+16,
+    "train_loss": 0.12799415017126403,
+    "train_runtime": 3304.0151,
+    "train_samples": 1000000,
+    "train_samples_per_second": 907.986,
+    "train_steps_per_second": 7.094
 }
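
A side note on the metrics above: in both the old and the new results, `eval_loss` and `eval_mse` agree to floating-point precision. That is what you would expect for a single-output regression head, where the Trainer's loss is mean squared error, i.e. the same quantity a `compute_metrics` function would report as `mse`; this is an inference from the numbers, not something the diff states. A toy illustration of the identity:

```python
# Toy illustration (not taken from the repo): for a regression head, the
# MSE training loss and an "mse" metric computed from predictions coincide.
import torch
import torch.nn.functional as F

preds = torch.tensor([0.2, -0.5, 1.3])
labels = torch.tensor([0.0, -0.4, 1.0])

print(F.mse_loss(preds, labels).item())       # what the Trainer would log as "loss"
print(((preds - labels) ** 2).mean().item())  # what compute_metrics would report as "mse"
```
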
eval_results.json CHANGED
@@ -1,9 +1,9 @@
 {
     "epoch": 3.0,
-    "eval_loss": 0.19889499247074127,
-    "eval_mse": 0.19889500241300032,
-    "eval_runtime": 57.9433,
-    "eval_samples": 96338,
-    "eval_samples_per_second": 1662.626,
-    "eval_steps_per_second": 207.841
+    "eval_loss": 0.12343376874923706,
+    "eval_mse": 0.12343375904401642,
+    "eval_runtime": 97.8103,
+    "eval_samples": 182111,
+    "eval_samples_per_second": 1861.88,
+    "eval_steps_per_second": 232.736
 }
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:ab2e42c2ac7fbb26e46c050038c5d4e57d29fb8cc3a58380747ebb7b05714bd2
+oid sha256:b78ef648390bba5e219244fe5581c6eb7febad364f3bbeb8d97598f4a526a064
 size 470640124
train_results.json CHANGED
@@ -1,9 +1,9 @@
 {
     "epoch": 3.0,
-    "total_flos": 4.283504864539085e+16,
-    "train_loss": 0.17857247165732115,
-    "train_runtime": 4471.1905,
-    "train_samples": 867042,
-    "train_samples_per_second": 581.752,
-    "train_steps_per_second": 18.18
+    "total_flos": 4.9403660544e+16,
+    "train_loss": 0.12799415017126403,
+    "train_runtime": 3304.0151,
+    "train_samples": 1000000,
+    "train_samples_per_second": 907.986,
+    "train_steps_per_second": 7.094
 }
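
The step counts scattered through this commit can be reconciled with the sample counts and batch sizes above. Assuming a single device and no gradient accumulation (neither is stated in the diff), steps per epoch is ceil(train_samples / train_batch_size), which reproduces both runs' totals:

```python
# Sanity-check sketch for the step counts in trainer_state.json, under the
# single-device / no-gradient-accumulation assumption noted above.
import math

def total_steps(train_samples: int, batch_size: int, epochs: int = 3) -> int:
    steps_per_epoch = math.ceil(train_samples / batch_size)
    return steps_per_epoch * epochs

print(total_steps(867_042, 32))     # 81288  -- old run's global_step / max_steps
print(total_steps(1_000_000, 128))  # 23439  -- new run's global_step / max_steps
```
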
trainer_state.json CHANGED
@@ -1,1186 +1,374 @@
1
  {
2
- "best_metric": 0.19889499247074127,
3
- "best_model_checkpoint": "multilingual-e5-small-aligned-transformed-readability/checkpoint-81288",
4
  "epoch": 3.0,
5
  "eval_steps": 500,
6
- "global_step": 81288,
7
  "is_hyper_param_search": false,
8
  "is_local_process_zero": true,
9
  "is_world_process_zero": true,
10
  "log_history": [
11
  {
12
- "epoch": 0.018452908178328904,
13
- "grad_norm": 2.7571725845336914,
14
- "learning_rate": 4.969245153036119e-05,
15
- "loss": 0.3765,
16
  "step": 500
17
  },
18
  {
19
- "epoch": 0.03690581635665781,
20
- "grad_norm": 2.832648515701294,
21
- "learning_rate": 4.938490306072237e-05,
22
- "loss": 0.2708,
23
  "step": 1000
24
  },
25
  {
26
- "epoch": 0.05535872453498671,
27
- "grad_norm": 1.4104365110397339,
28
- "learning_rate": 4.907735459108356e-05,
29
- "loss": 0.2557,
30
  "step": 1500
31
  },
32
  {
33
- "epoch": 0.07381163271331562,
34
- "grad_norm": 1.8531866073608398,
35
- "learning_rate": 4.876980612144474e-05,
36
- "loss": 0.2635,
37
  "step": 2000
38
  },
39
  {
40
- "epoch": 0.09226454089164453,
41
- "grad_norm": 1.649173378944397,
42
- "learning_rate": 4.846225765180593e-05,
43
- "loss": 0.2558,
44
  "step": 2500
45
  },
46
  {
47
- "epoch": 0.11071744906997343,
48
- "grad_norm": 1.7052029371261597,
49
- "learning_rate": 4.815470918216711e-05,
50
- "loss": 0.2514,
51
  "step": 3000
52
  },
53
  {
54
- "epoch": 0.12917035724830234,
55
- "grad_norm": 3.926635980606079,
56
- "learning_rate": 4.78471607125283e-05,
57
- "loss": 0.252,
58
  "step": 3500
59
  },
60
  {
61
- "epoch": 0.14762326542663123,
62
- "grad_norm": 3.181887626647949,
63
- "learning_rate": 4.7539612242889484e-05,
64
- "loss": 0.2541,
65
  "step": 4000
66
  },
67
  {
68
- "epoch": 0.16607617360496013,
69
- "grad_norm": 4.0558180809021,
70
- "learning_rate": 4.723206377325067e-05,
71
- "loss": 0.2421,
72
  "step": 4500
73
  },
74
  {
75
- "epoch": 0.18452908178328906,
76
- "grad_norm": 1.432974934577942,
77
- "learning_rate": 4.692451530361185e-05,
78
- "loss": 0.2362,
79
  "step": 5000
80
  },
81
  {
82
- "epoch": 0.20298198996161795,
83
- "grad_norm": 3.173771858215332,
84
- "learning_rate": 4.661696683397304e-05,
85
- "loss": 0.2443,
86
  "step": 5500
87
  },
88
  {
89
- "epoch": 0.22143489813994685,
90
- "grad_norm": 2.175633668899536,
91
- "learning_rate": 4.6309418364334224e-05,
92
- "loss": 0.2329,
93
  "step": 6000
94
  },
95
  {
96
- "epoch": 0.23988780631827575,
97
- "grad_norm": 4.211012840270996,
98
- "learning_rate": 4.60018698946954e-05,
99
- "loss": 0.2303,
100
  "step": 6500
101
  },
102
  {
103
- "epoch": 0.2583407144966047,
104
- "grad_norm": 1.5053297281265259,
105
- "learning_rate": 4.5694321425056594e-05,
106
- "loss": 0.2272,
107
  "step": 7000
108
  },
109
  {
110
- "epoch": 0.27679362267493357,
111
- "grad_norm": 2.2658045291900635,
112
- "learning_rate": 4.538677295541778e-05,
113
- "loss": 0.2309,
114
  "step": 7500
115
  },
116
  {
117
- "epoch": 0.29524653085326247,
118
- "grad_norm": 3.0872204303741455,
119
- "learning_rate": 4.507922448577896e-05,
120
- "loss": 0.228,
121
  "step": 8000
122
  },
123
  {
124
- "epoch": 0.31369943903159137,
125
- "grad_norm": 1.5754343271255493,
126
- "learning_rate": 4.477167601614014e-05,
127
- "loss": 0.2344,
128
  "step": 8500
129
  },
130
  {
131
- "epoch": 0.33215234720992026,
132
- "grad_norm": 8.282055854797363,
133
- "learning_rate": 4.4464127546501335e-05,
134
- "loss": 0.2235,
135
  "step": 9000
136
  },
137
  {
138
- "epoch": 0.3506052553882492,
139
- "grad_norm": 2.818925619125366,
140
- "learning_rate": 4.415657907686251e-05,
141
- "loss": 0.225,
142
  "step": 9500
143
  },
144
  {
145
- "epoch": 0.3690581635665781,
146
- "grad_norm": 4.582856178283691,
147
- "learning_rate": 4.38490306072237e-05,
148
- "loss": 0.2195,
149
  "step": 10000
150
  },
151
  {
152
- "epoch": 0.387511071744907,
153
- "grad_norm": 4.176349639892578,
154
- "learning_rate": 4.354148213758489e-05,
155
- "loss": 0.2249,
156
  "step": 10500
157
  },
158
  {
159
- "epoch": 0.4059639799232359,
160
- "grad_norm": 1.69513738155365,
161
- "learning_rate": 4.323393366794607e-05,
162
- "loss": 0.2227,
163
  "step": 11000
164
  },
165
  {
166
- "epoch": 0.4244168881015648,
167
- "grad_norm": 2.0948939323425293,
168
- "learning_rate": 4.2926385198307254e-05,
169
- "loss": 0.2248,
170
  "step": 11500
171
  },
172
  {
173
- "epoch": 0.4428697962798937,
174
- "grad_norm": 2.4989616870880127,
175
- "learning_rate": 4.261883672866844e-05,
176
- "loss": 0.2194,
177
  "step": 12000
178
  },
179
  {
180
- "epoch": 0.4613227044582226,
181
- "grad_norm": 1.1772059202194214,
182
- "learning_rate": 4.2311288259029624e-05,
183
- "loss": 0.2232,
184
  "step": 12500
185
  },
186
  {
187
- "epoch": 0.4797756126365515,
188
- "grad_norm": 5.26480770111084,
189
- "learning_rate": 4.200373978939081e-05,
190
- "loss": 0.2199,
191
  "step": 13000
192
  },
193
  {
194
- "epoch": 0.49822852081488045,
195
- "grad_norm": 1.3563578128814697,
196
- "learning_rate": 4.1696191319751994e-05,
197
- "loss": 0.2264,
198
  "step": 13500
199
  },
200
  {
201
- "epoch": 0.5166814289932093,
202
- "grad_norm": 1.2438708543777466,
203
- "learning_rate": 4.138864285011318e-05,
204
- "loss": 0.2239,
205
  "step": 14000
206
  },
207
  {
208
- "epoch": 0.5351343371715382,
209
- "grad_norm": 2.229975700378418,
210
- "learning_rate": 4.1081094380474365e-05,
211
- "loss": 0.211,
212
  "step": 14500
213
  },
214
  {
215
- "epoch": 0.5535872453498671,
216
- "grad_norm": 1.4763661623001099,
217
- "learning_rate": 4.077354591083555e-05,
218
- "loss": 0.2176,
219
  "step": 15000
220
  },
221
  {
222
- "epoch": 0.572040153528196,
223
- "grad_norm": 2.88029408454895,
224
- "learning_rate": 4.0465997441196735e-05,
225
- "loss": 0.2229,
226
  "step": 15500
227
  },
228
  {
229
- "epoch": 0.5904930617065249,
230
- "grad_norm": 0.7661384344100952,
231
- "learning_rate": 4.015844897155792e-05,
232
- "loss": 0.2195,
233
  "step": 16000
234
  },
235
  {
236
- "epoch": 0.6089459698848538,
237
- "grad_norm": 2.0358428955078125,
238
- "learning_rate": 3.9850900501919105e-05,
239
- "loss": 0.2161,
240
  "step": 16500
241
  },
242
  {
243
- "epoch": 0.6273988780631827,
244
- "grad_norm": 1.9549895524978638,
245
- "learning_rate": 3.954335203228029e-05,
246
- "loss": 0.2193,
247
  "step": 17000
248
  },
249
  {
250
- "epoch": 0.6458517862415116,
251
- "grad_norm": 2.1742184162139893,
252
- "learning_rate": 3.9235803562641475e-05,
253
- "loss": 0.2171,
254
  "step": 17500
255
  },
256
  {
257
- "epoch": 0.6643046944198405,
258
- "grad_norm": 1.1012811660766602,
259
- "learning_rate": 3.892825509300266e-05,
260
- "loss": 0.2246,
261
  "step": 18000
262
  },
263
  {
264
- "epoch": 0.6827576025981694,
265
- "grad_norm": 2.7291996479034424,
266
- "learning_rate": 3.8620706623363846e-05,
267
- "loss": 0.2114,
268
  "step": 18500
269
  },
270
  {
271
- "epoch": 0.7012105107764984,
272
- "grad_norm": 1.3418771028518677,
273
- "learning_rate": 3.8313158153725024e-05,
274
- "loss": 0.2173,
275
  "step": 19000
276
  },
277
  {
278
- "epoch": 0.7196634189548273,
279
- "grad_norm": 2.7479825019836426,
280
- "learning_rate": 3.8005609684086216e-05,
281
- "loss": 0.2163,
282
  "step": 19500
283
  },
284
  {
285
- "epoch": 0.7381163271331562,
286
- "grad_norm": 1.7314202785491943,
287
- "learning_rate": 3.76980612144474e-05,
288
- "loss": 0.2142,
289
  "step": 20000
290
  },
291
  {
292
- "epoch": 0.7565692353114851,
293
- "grad_norm": 1.5135014057159424,
294
- "learning_rate": 3.739051274480858e-05,
295
- "loss": 0.2156,
296
  "step": 20500
297
  },
298
  {
299
- "epoch": 0.775022143489814,
300
- "grad_norm": 0.9992055296897888,
301
- "learning_rate": 3.708296427516977e-05,
302
- "loss": 0.2136,
303
  "step": 21000
304
  },
305
  {
306
- "epoch": 0.7934750516681429,
307
- "grad_norm": 1.2363203763961792,
308
- "learning_rate": 3.6775415805530957e-05,
309
- "loss": 0.2134,
310
  "step": 21500
311
  },
312
  {
313
- "epoch": 0.8119279598464718,
314
- "grad_norm": 1.8317536115646362,
315
- "learning_rate": 3.6467867335892135e-05,
316
- "loss": 0.217,
317
  "step": 22000
318
  },
319
  {
320
- "epoch": 0.8303808680248007,
321
- "grad_norm": 1.7996548414230347,
322
- "learning_rate": 3.616031886625332e-05,
323
- "loss": 0.213,
324
  "step": 22500
325
  },
326
  {
327
- "epoch": 0.8488337762031296,
328
- "grad_norm": 1.1373772621154785,
329
- "learning_rate": 3.585277039661451e-05,
330
- "loss": 0.2249,
331
  "step": 23000
332
  },
333
- {
334
- "epoch": 0.8672866843814585,
335
- "grad_norm": 1.2996028661727905,
336
- "learning_rate": 3.554522192697569e-05,
337
- "loss": 0.207,
338
- "step": 23500
339
- },
340
- {
341
- "epoch": 0.8857395925597874,
342
- "grad_norm": 1.505035638809204,
343
- "learning_rate": 3.5237673457336876e-05,
344
- "loss": 0.2119,
345
- "step": 24000
346
- },
347
- {
348
- "epoch": 0.9041925007381163,
349
- "grad_norm": 1.2497526407241821,
350
- "learning_rate": 3.493012498769807e-05,
351
- "loss": 0.2096,
352
- "step": 24500
353
- },
354
- {
355
- "epoch": 0.9226454089164452,
356
- "grad_norm": 2.1352574825286865,
357
- "learning_rate": 3.4622576518059246e-05,
358
- "loss": 0.2143,
359
- "step": 25000
360
- },
361
- {
362
- "epoch": 0.9410983170947741,
363
- "grad_norm": 1.664171576499939,
364
- "learning_rate": 3.431502804842043e-05,
365
- "loss": 0.2036,
366
- "step": 25500
367
- },
368
- {
369
- "epoch": 0.959551225273103,
370
- "grad_norm": 2.7897629737854004,
371
- "learning_rate": 3.400747957878162e-05,
372
- "loss": 0.2118,
373
- "step": 26000
374
- },
375
- {
376
- "epoch": 0.978004133451432,
377
- "grad_norm": 1.0113285779953003,
378
- "learning_rate": 3.36999311091428e-05,
379
- "loss": 0.2175,
380
- "step": 26500
381
- },
382
- {
383
- "epoch": 0.9964570416297609,
384
- "grad_norm": 2.9997363090515137,
385
- "learning_rate": 3.3392382639503986e-05,
386
- "loss": 0.2104,
387
- "step": 27000
388
- },
389
- {
390
- "epoch": 1.0,
391
- "eval_loss": 0.20612339675426483,
392
- "eval_mse": 0.20612340591663994,
393
- "eval_runtime": 57.193,
394
- "eval_samples_per_second": 1684.438,
395
- "eval_steps_per_second": 210.568,
396
- "step": 27096
397
- },
398
- {
399
- "epoch": 1.0149099498080898,
400
- "grad_norm": 1.4240479469299316,
401
- "learning_rate": 3.308483416986517e-05,
402
- "loss": 0.1821,
403
- "step": 27500
404
- },
405
- {
406
- "epoch": 1.0333628579864187,
407
- "grad_norm": 1.0634160041809082,
408
- "learning_rate": 3.277728570022636e-05,
409
- "loss": 0.1745,
410
- "step": 28000
411
- },
412
- {
413
- "epoch": 1.0518157661647476,
414
- "grad_norm": 1.9994093179702759,
415
- "learning_rate": 3.246973723058754e-05,
416
- "loss": 0.1712,
417
- "step": 28500
418
- },
419
- {
420
- "epoch": 1.0702686743430765,
421
- "grad_norm": 0.736122727394104,
422
- "learning_rate": 3.216218876094873e-05,
423
- "loss": 0.1738,
424
- "step": 29000
425
- },
426
- {
427
- "epoch": 1.0887215825214054,
428
- "grad_norm": 1.7938990592956543,
429
- "learning_rate": 3.185464029130991e-05,
430
- "loss": 0.1698,
431
- "step": 29500
432
- },
433
- {
434
- "epoch": 1.1071744906997343,
435
- "grad_norm": 1.9040451049804688,
436
- "learning_rate": 3.15470918216711e-05,
437
- "loss": 0.1734,
438
- "step": 30000
439
- },
440
- {
441
- "epoch": 1.1256273988780632,
442
- "grad_norm": 1.222025990486145,
443
- "learning_rate": 3.123954335203228e-05,
444
- "loss": 0.1715,
445
- "step": 30500
446
- },
447
- {
448
- "epoch": 1.144080307056392,
449
- "grad_norm": 1.4371784925460815,
450
- "learning_rate": 3.093199488239347e-05,
451
- "loss": 0.1688,
452
- "step": 31000
453
- },
454
- {
455
- "epoch": 1.162533215234721,
456
- "grad_norm": 5.807870864868164,
457
- "learning_rate": 3.062444641275465e-05,
458
- "loss": 0.179,
459
- "step": 31500
460
- },
461
- {
462
- "epoch": 1.1809861234130499,
463
- "grad_norm": 1.3887362480163574,
464
- "learning_rate": 3.0316897943115834e-05,
465
- "loss": 0.179,
466
- "step": 32000
467
- },
468
- {
469
- "epoch": 1.1994390315913788,
470
- "grad_norm": 2.2503085136413574,
471
- "learning_rate": 3.0009349473477023e-05,
472
- "loss": 0.1738,
473
- "step": 32500
474
- },
475
- {
476
- "epoch": 1.2178919397697077,
477
- "grad_norm": 2.3477783203125,
478
- "learning_rate": 2.9701801003838208e-05,
479
- "loss": 0.1722,
480
- "step": 33000
481
- },
482
- {
483
- "epoch": 1.2363448479480366,
484
- "grad_norm": 2.7416176795959473,
485
- "learning_rate": 2.939425253419939e-05,
486
- "loss": 0.1786,
487
- "step": 33500
488
- },
489
- {
490
- "epoch": 1.2547977561263655,
491
- "grad_norm": 0.7052303552627563,
492
- "learning_rate": 2.9086704064560578e-05,
493
- "loss": 0.1728,
494
- "step": 34000
495
- },
496
- {
497
- "epoch": 1.2732506643046944,
498
- "grad_norm": 2.529670000076294,
499
- "learning_rate": 2.877915559492176e-05,
500
- "loss": 0.1741,
501
- "step": 34500
502
- },
503
- {
504
- "epoch": 1.2917035724830233,
505
- "grad_norm": 1.9189903736114502,
506
- "learning_rate": 2.8471607125282945e-05,
507
- "loss": 0.1762,
508
- "step": 35000
509
- },
510
- {
511
- "epoch": 1.3101564806613522,
512
- "grad_norm": 2.1008570194244385,
513
- "learning_rate": 2.8164058655644134e-05,
514
- "loss": 0.1719,
515
- "step": 35500
516
- },
517
- {
518
- "epoch": 1.328609388839681,
519
- "grad_norm": 2.663116216659546,
520
- "learning_rate": 2.7856510186005312e-05,
521
- "loss": 0.1739,
522
- "step": 36000
523
- },
524
- {
525
- "epoch": 1.34706229701801,
526
- "grad_norm": 3.453697443008423,
527
- "learning_rate": 2.75489617163665e-05,
528
- "loss": 0.1691,
529
- "step": 36500
530
- },
531
- {
532
- "epoch": 1.3655152051963388,
533
- "grad_norm": 5.848513603210449,
534
- "learning_rate": 2.7241413246727686e-05,
535
- "loss": 0.1674,
536
- "step": 37000
537
- },
538
- {
539
- "epoch": 1.3839681133746677,
540
- "grad_norm": 1.1454991102218628,
541
- "learning_rate": 2.6933864777088867e-05,
542
- "loss": 0.1764,
543
- "step": 37500
544
- },
545
- {
546
- "epoch": 1.4024210215529966,
547
- "grad_norm": 0.9938109517097473,
548
- "learning_rate": 2.6626316307450056e-05,
549
- "loss": 0.1706,
550
- "step": 38000
551
- },
552
- {
553
- "epoch": 1.4208739297313255,
554
- "grad_norm": 2.252068042755127,
555
- "learning_rate": 2.631876783781124e-05,
556
- "loss": 0.1665,
557
- "step": 38500
558
- },
559
- {
560
- "epoch": 1.4393268379096544,
561
- "grad_norm": 1.9789129495620728,
562
- "learning_rate": 2.6011219368172423e-05,
563
- "loss": 0.1746,
564
- "step": 39000
565
- },
566
- {
567
- "epoch": 1.4577797460879833,
568
- "grad_norm": 1.5638952255249023,
569
- "learning_rate": 2.570367089853361e-05,
570
- "loss": 0.1699,
571
- "step": 39500
572
- },
573
- {
574
- "epoch": 1.4762326542663124,
575
- "grad_norm": 2.094984292984009,
576
- "learning_rate": 2.5396122428894797e-05,
577
- "loss": 0.1715,
578
- "step": 40000
579
- },
580
- {
581
- "epoch": 1.4946855624446413,
582
- "grad_norm": 2.625145435333252,
583
- "learning_rate": 2.508857395925598e-05,
584
- "loss": 0.1708,
585
- "step": 40500
586
- },
587
- {
588
- "epoch": 1.51313847062297,
589
- "grad_norm": 1.2873293161392212,
590
- "learning_rate": 2.4781025489617167e-05,
591
- "loss": 0.1721,
592
- "step": 41000
593
- },
594
- {
595
- "epoch": 1.531591378801299,
596
- "grad_norm": 2.8465254306793213,
597
- "learning_rate": 2.447347701997835e-05,
598
- "loss": 0.1761,
599
- "step": 41500
600
- },
601
- {
602
- "epoch": 1.550044286979628,
603
- "grad_norm": 1.4705593585968018,
604
- "learning_rate": 2.4165928550339534e-05,
605
- "loss": 0.1756,
606
- "step": 42000
607
- },
608
- {
609
- "epoch": 1.568497195157957,
610
- "grad_norm": 0.9254862666130066,
611
- "learning_rate": 2.3858380080700722e-05,
612
- "loss": 0.1759,
613
- "step": 42500
614
- },
615
- {
616
- "epoch": 1.5869501033362858,
617
- "grad_norm": 1.8685784339904785,
618
- "learning_rate": 2.3550831611061904e-05,
619
- "loss": 0.167,
620
- "step": 43000
621
- },
622
- {
623
- "epoch": 1.6054030115146147,
624
- "grad_norm": 1.4468207359313965,
625
- "learning_rate": 2.324328314142309e-05,
626
- "loss": 0.1776,
627
- "step": 43500
628
- },
629
- {
630
- "epoch": 1.6238559196929436,
631
- "grad_norm": 2.477450132369995,
632
- "learning_rate": 2.2935734671784274e-05,
633
- "loss": 0.174,
634
- "step": 44000
635
- },
636
- {
637
- "epoch": 1.6423088278712725,
638
- "grad_norm": 11.740235328674316,
639
- "learning_rate": 2.262818620214546e-05,
640
- "loss": 0.1652,
641
- "step": 44500
642
- },
643
- {
644
- "epoch": 1.6607617360496014,
645
- "grad_norm": 2.253143548965454,
646
- "learning_rate": 2.2320637732506645e-05,
647
- "loss": 0.1776,
648
- "step": 45000
649
- },
650
- {
651
- "epoch": 1.6792146442279303,
652
- "grad_norm": 4.1611151695251465,
653
- "learning_rate": 2.201308926286783e-05,
654
- "loss": 0.1643,
655
- "step": 45500
656
- },
657
- {
658
- "epoch": 1.6976675524062592,
659
- "grad_norm": 3.693655252456665,
660
- "learning_rate": 2.1705540793229015e-05,
661
- "loss": 0.1705,
662
- "step": 46000
663
- },
664
- {
665
- "epoch": 1.7161204605845881,
666
- "grad_norm": 3.8450114727020264,
667
- "learning_rate": 2.13979923235902e-05,
668
- "loss": 0.1715,
669
- "step": 46500
670
- },
671
- {
672
- "epoch": 1.734573368762917,
673
- "grad_norm": 3.296321392059326,
674
- "learning_rate": 2.1090443853951382e-05,
675
- "loss": 0.1642,
676
- "step": 47000
677
- },
678
- {
679
- "epoch": 1.753026276941246,
680
- "grad_norm": 2.0819671154022217,
681
- "learning_rate": 2.0782895384312567e-05,
682
- "loss": 0.1624,
683
- "step": 47500
684
- },
685
- {
686
- "epoch": 1.7714791851195748,
687
- "grad_norm": 0.8893182873725891,
688
- "learning_rate": 2.0475346914673755e-05,
689
- "loss": 0.1678,
690
- "step": 48000
691
- },
692
- {
693
- "epoch": 1.7899320932979037,
694
- "grad_norm": 2.971529960632324,
695
- "learning_rate": 2.0167798445034937e-05,
696
- "loss": 0.1664,
697
- "step": 48500
698
- },
699
- {
700
- "epoch": 1.8083850014762326,
701
- "grad_norm": 2.0590310096740723,
702
- "learning_rate": 1.9860249975396122e-05,
703
- "loss": 0.1779,
704
- "step": 49000
705
- },
706
- {
707
- "epoch": 1.8268379096545617,
708
- "grad_norm": 2.0498523712158203,
709
- "learning_rate": 1.955270150575731e-05,
710
- "loss": 0.1695,
711
- "step": 49500
712
- },
713
- {
714
- "epoch": 1.8452908178328906,
715
- "grad_norm": 1.6503143310546875,
716
- "learning_rate": 1.9245153036118493e-05,
717
- "loss": 0.1678,
718
- "step": 50000
719
- },
720
- {
721
- "epoch": 1.8637437260112195,
722
- "grad_norm": 1.0318537950515747,
723
- "learning_rate": 1.8937604566479678e-05,
724
- "loss": 0.1644,
725
- "step": 50500
726
- },
727
- {
728
- "epoch": 1.8821966341895484,
729
- "grad_norm": 1.936584711074829,
730
- "learning_rate": 1.8630056096840863e-05,
731
- "loss": 0.1697,
732
- "step": 51000
733
- },
734
- {
735
- "epoch": 1.9006495423678773,
736
- "grad_norm": 2.5828168392181396,
737
- "learning_rate": 1.8322507627202048e-05,
738
- "loss": 0.1696,
739
- "step": 51500
740
- },
741
- {
742
- "epoch": 1.9191024505462062,
743
- "grad_norm": 3.156874895095825,
744
- "learning_rate": 1.8014959157563233e-05,
745
- "loss": 0.1673,
746
- "step": 52000
747
- },
748
- {
749
- "epoch": 1.937555358724535,
750
- "grad_norm": 3.178074836730957,
751
- "learning_rate": 1.7707410687924418e-05,
752
- "loss": 0.176,
753
- "step": 52500
754
- },
755
- {
756
- "epoch": 1.956008266902864,
757
- "grad_norm": 1.48374342918396,
758
- "learning_rate": 1.7399862218285603e-05,
759
- "loss": 0.1677,
760
- "step": 53000
761
- },
762
- {
763
- "epoch": 1.974461175081193,
764
- "grad_norm": 3.43747878074646,
765
- "learning_rate": 1.709231374864679e-05,
766
- "loss": 0.1696,
767
- "step": 53500
768
- },
769
- {
770
- "epoch": 1.9929140832595218,
771
- "grad_norm": 1.6862876415252686,
772
- "learning_rate": 1.678476527900797e-05,
773
- "loss": 0.1718,
774
- "step": 54000
775
- },
776
- {
777
- "epoch": 2.0,
778
- "eval_loss": 0.20655478537082672,
779
- "eval_mse": 0.2065547745621827,
780
- "eval_runtime": 52.3234,
781
- "eval_samples_per_second": 1841.202,
782
- "eval_steps_per_second": 230.165,
783
- "step": 54192
784
- },
785
- {
786
- "epoch": 2.0113669914378507,
787
- "grad_norm": 0.8314543962478638,
788
- "learning_rate": 1.647721680936916e-05,
789
- "loss": 0.1449,
790
- "step": 54500
791
- },
792
- {
793
- "epoch": 2.0298198996161796,
794
- "grad_norm": 1.8953380584716797,
795
- "learning_rate": 1.6169668339730344e-05,
796
- "loss": 0.1357,
797
- "step": 55000
798
- },
799
- {
800
- "epoch": 2.0482728077945085,
801
- "grad_norm": 0.7893266081809998,
802
- "learning_rate": 1.5862119870091526e-05,
803
- "loss": 0.138,
804
- "step": 55500
805
- },
806
- {
807
- "epoch": 2.0667257159728374,
808
- "grad_norm": 1.337292194366455,
809
- "learning_rate": 1.555457140045271e-05,
810
- "loss": 0.1407,
811
- "step": 56000
812
- },
813
- {
814
- "epoch": 2.0851786241511663,
815
- "grad_norm": 1.6890192031860352,
816
- "learning_rate": 1.5247022930813898e-05,
817
- "loss": 0.1406,
818
- "step": 56500
819
- },
820
- {
821
- "epoch": 2.103631532329495,
822
- "grad_norm": 2.1817214488983154,
823
- "learning_rate": 1.4939474461175081e-05,
824
- "loss": 0.1332,
825
- "step": 57000
826
- },
827
- {
828
- "epoch": 2.122084440507824,
829
- "grad_norm": 1.477333664894104,
830
- "learning_rate": 1.4631925991536266e-05,
831
- "loss": 0.1415,
832
- "step": 57500
833
- },
834
- {
835
- "epoch": 2.140537348686153,
836
- "grad_norm": 3.889193534851074,
837
- "learning_rate": 1.4324377521897453e-05,
838
- "loss": 0.1399,
839
- "step": 58000
840
- },
841
- {
842
- "epoch": 2.158990256864482,
843
- "grad_norm": 11.35392951965332,
844
- "learning_rate": 1.4016829052258637e-05,
845
- "loss": 0.1345,
846
- "step": 58500
847
- },
848
- {
849
- "epoch": 2.1774431650428108,
850
- "grad_norm": 2.2750699520111084,
851
- "learning_rate": 1.3709280582619822e-05,
852
- "loss": 0.1347,
853
- "step": 59000
854
- },
855
- {
856
- "epoch": 2.1958960732211397,
857
- "grad_norm": 4.66851282119751,
858
- "learning_rate": 1.3401732112981005e-05,
859
- "loss": 0.1359,
860
- "step": 59500
861
- },
862
- {
863
- "epoch": 2.2143489813994686,
864
- "grad_norm": 1.2594196796417236,
865
- "learning_rate": 1.3094183643342192e-05,
866
- "loss": 0.135,
867
- "step": 60000
868
- },
869
- {
870
- "epoch": 2.2328018895777975,
871
- "grad_norm": 0.6602271199226379,
872
- "learning_rate": 1.2786635173703375e-05,
873
- "loss": 0.1381,
874
- "step": 60500
875
- },
876
- {
877
- "epoch": 2.2512547977561264,
878
- "grad_norm": 0.8580902814865112,
879
- "learning_rate": 1.2479086704064562e-05,
880
- "loss": 0.1308,
881
- "step": 61000
882
- },
883
- {
884
- "epoch": 2.2697077059344553,
885
- "grad_norm": 0.8672662377357483,
886
- "learning_rate": 1.2171538234425746e-05,
887
- "loss": 0.1395,
888
- "step": 61500
889
- },
890
- {
891
- "epoch": 2.288160614112784,
892
- "grad_norm": 1.646864891052246,
893
- "learning_rate": 1.186398976478693e-05,
894
- "loss": 0.1419,
895
- "step": 62000
896
- },
897
- {
898
- "epoch": 2.306613522291113,
899
- "grad_norm": 4.04207181930542,
900
- "learning_rate": 1.1556441295148116e-05,
901
- "loss": 0.1337,
902
- "step": 62500
903
- },
904
- {
905
- "epoch": 2.325066430469442,
906
- "grad_norm": 5.613555431365967,
907
- "learning_rate": 1.1248892825509301e-05,
908
- "loss": 0.1429,
909
- "step": 63000
910
- },
911
- {
912
- "epoch": 2.343519338647771,
913
- "grad_norm": 1.977729082107544,
914
- "learning_rate": 1.0941344355870485e-05,
915
- "loss": 0.1323,
916
- "step": 63500
917
- },
918
- {
919
- "epoch": 2.3619722468260997,
920
- "grad_norm": 1.2868248224258423,
921
- "learning_rate": 1.0633795886231671e-05,
922
- "loss": 0.1383,
923
- "step": 64000
924
- },
925
- {
926
- "epoch": 2.3804251550044286,
927
- "grad_norm": 1.098742961883545,
928
- "learning_rate": 1.0326247416592857e-05,
929
- "loss": 0.1387,
930
- "step": 64500
931
- },
932
- {
933
- "epoch": 2.3988780631827575,
934
- "grad_norm": 2.9264678955078125,
935
- "learning_rate": 1.001869894695404e-05,
936
- "loss": 0.1386,
937
- "step": 65000
938
- },
939
- {
940
- "epoch": 2.4173309713610864,
941
- "grad_norm": 3.179082155227661,
942
- "learning_rate": 9.711150477315225e-06,
943
- "loss": 0.1444,
944
- "step": 65500
945
- },
946
- {
947
- "epoch": 2.4357838795394153,
948
- "grad_norm": 1.5083171129226685,
949
- "learning_rate": 9.40360200767641e-06,
950
- "loss": 0.1351,
951
- "step": 66000
952
- },
953
- {
954
- "epoch": 2.4542367877177442,
955
- "grad_norm": 1.590307354927063,
956
- "learning_rate": 9.096053538037595e-06,
957
- "loss": 0.1379,
958
- "step": 66500
959
- },
960
- {
961
- "epoch": 2.472689695896073,
962
- "grad_norm": 1.490502953529358,
963
- "learning_rate": 8.78850506839878e-06,
964
- "loss": 0.1285,
965
- "step": 67000
966
- },
967
- {
968
- "epoch": 2.491142604074402,
969
- "grad_norm": 2.0561413764953613,
970
- "learning_rate": 8.480956598759966e-06,
971
- "loss": 0.1396,
972
- "step": 67500
973
- },
974
- {
975
- "epoch": 2.509595512252731,
976
- "grad_norm": 1.0588093996047974,
977
- "learning_rate": 8.17340812912115e-06,
978
- "loss": 0.1367,
979
- "step": 68000
980
- },
981
- {
982
- "epoch": 2.52804842043106,
983
- "grad_norm": 0.8184725046157837,
984
- "learning_rate": 7.865859659482334e-06,
985
- "loss": 0.1322,
986
- "step": 68500
987
- },
988
- {
989
- "epoch": 2.5465013286093887,
990
- "grad_norm": 1.3976045846939087,
991
- "learning_rate": 7.55831118984352e-06,
992
- "loss": 0.1332,
993
- "step": 69000
994
- },
995
- {
996
- "epoch": 2.5649542367877176,
997
- "grad_norm": 2.417647361755371,
998
- "learning_rate": 7.250762720204704e-06,
999
- "loss": 0.1342,
1000
- "step": 69500
1001
- },
1002
- {
1003
- "epoch": 2.5834071449660465,
1004
- "grad_norm": 4.064483165740967,
1005
- "learning_rate": 6.94321425056589e-06,
1006
- "loss": 0.1355,
1007
- "step": 70000
1008
- },
1009
- {
1010
- "epoch": 2.6018600531443754,
1011
- "grad_norm": 2.23105788230896,
1012
- "learning_rate": 6.635665780927075e-06,
1013
- "loss": 0.1315,
1014
- "step": 70500
1015
- },
1016
- {
1017
- "epoch": 2.6203129613227043,
1018
- "grad_norm": 2.205604076385498,
1019
- "learning_rate": 6.328117311288259e-06,
1020
- "loss": 0.1379,
1021
- "step": 71000
1022
- },
1023
- {
1024
- "epoch": 2.638765869501033,
1025
- "grad_norm": 2.5101168155670166,
1026
- "learning_rate": 6.020568841649444e-06,
1027
- "loss": 0.142,
1028
- "step": 71500
1029
- },
1030
- {
1031
- "epoch": 2.657218777679362,
1032
- "grad_norm": 11.855621337890625,
1033
- "learning_rate": 5.713020372010629e-06,
1034
- "loss": 0.1359,
1035
- "step": 72000
1036
- },
1037
- {
1038
- "epoch": 2.675671685857691,
1039
- "grad_norm": 1.7274291515350342,
1040
- "learning_rate": 5.4054719023718145e-06,
1041
- "loss": 0.1386,
1042
- "step": 72500
1043
- },
1044
- {
1045
- "epoch": 2.69412459403602,
1046
- "grad_norm": 1.0947271585464478,
1047
- "learning_rate": 5.097923432732999e-06,
1048
- "loss": 0.1393,
1049
- "step": 73000
1050
- },
1051
- {
1052
- "epoch": 2.712577502214349,
1053
- "grad_norm": 1.6208831071853638,
1054
- "learning_rate": 4.790374963094184e-06,
1055
- "loss": 0.1276,
1056
- "step": 73500
1057
- },
1058
- {
1059
- "epoch": 2.7310304103926777,
1060
- "grad_norm": 1.5204744338989258,
1061
- "learning_rate": 4.482826493455368e-06,
1062
- "loss": 0.1297,
1063
- "step": 74000
1064
- },
1065
- {
1066
- "epoch": 2.7494833185710066,
1067
- "grad_norm": 4.482317924499512,
1068
- "learning_rate": 4.175278023816553e-06,
1069
- "loss": 0.1303,
1070
- "step": 74500
1071
- },
1072
- {
1073
- "epoch": 2.7679362267493355,
1074
- "grad_norm": 9.054340362548828,
1075
- "learning_rate": 3.8677295541777385e-06,
1076
- "loss": 0.1319,
1077
- "step": 75000
1078
- },
1079
- {
1080
- "epoch": 2.7863891349276644,
1081
- "grad_norm": 1.8670865297317505,
1082
- "learning_rate": 3.5601810845389237e-06,
1083
- "loss": 0.1301,
1084
- "step": 75500
1085
- },
1086
- {
1087
- "epoch": 2.8048420431059933,
1088
- "grad_norm": 1.451202154159546,
1089
- "learning_rate": 3.2526326149001084e-06,
1090
- "loss": 0.1309,
1091
- "step": 76000
1092
- },
1093
- {
1094
- "epoch": 2.823294951284322,
1095
- "grad_norm": 3.281291961669922,
1096
- "learning_rate": 2.945084145261293e-06,
1097
- "loss": 0.1407,
1098
- "step": 76500
1099
- },
1100
- {
1101
- "epoch": 2.841747859462651,
1102
- "grad_norm": 3.273066997528076,
1103
- "learning_rate": 2.6375356756224782e-06,
1104
- "loss": 0.1267,
1105
- "step": 77000
1106
- },
1107
- {
1108
- "epoch": 2.86020076764098,
1109
- "grad_norm": 8.522459030151367,
1110
- "learning_rate": 2.3299872059836634e-06,
1111
- "loss": 0.1304,
1112
- "step": 77500
1113
- },
1114
- {
1115
- "epoch": 2.878653675819309,
1116
- "grad_norm": 1.6981911659240723,
1117
- "learning_rate": 2.022438736344848e-06,
1118
- "loss": 0.1436,
1119
- "step": 78000
1120
- },
1121
- {
1122
- "epoch": 2.8971065839976378,
1123
- "grad_norm": 2.415241003036499,
1124
- "learning_rate": 1.7148902667060328e-06,
1125
- "loss": 0.1297,
1126
- "step": 78500
1127
- },
1128
- {
1129
- "epoch": 2.9155594921759667,
1130
- "grad_norm": 1.65168035030365,
1131
- "learning_rate": 1.4073417970672177e-06,
1132
- "loss": 0.138,
1133
- "step": 79000
1134
- },
1135
- {
1136
- "epoch": 2.934012400354296,
1137
- "grad_norm": 1.9556164741516113,
1138
- "learning_rate": 1.0997933274284029e-06,
1139
- "loss": 0.1346,
1140
- "step": 79500
1141
- },
1142
- {
1143
- "epoch": 2.952465308532625,
1144
- "grad_norm": 2.9853076934814453,
1145
- "learning_rate": 7.922448577895876e-07,
1146
- "loss": 0.1371,
1147
- "step": 80000
1148
- },
1149
- {
1150
- "epoch": 2.970918216710954,
1151
- "grad_norm": 2.885925054550171,
1152
- "learning_rate": 4.846963881507725e-07,
1153
- "loss": 0.1342,
1154
- "step": 80500
1155
- },
1156
- {
1157
- "epoch": 2.9893711248892827,
1158
- "grad_norm": 2.020306348800659,
1159
- "learning_rate": 1.771479185119575e-07,
1160
- "loss": 0.141,
1161
- "step": 81000
1162
- },
1163
  {
1164
  "epoch": 3.0,
1165
- "eval_loss": 0.19889499247074127,
1166
- "eval_mse": 0.19889500241300032,
1167
- "eval_runtime": 55.3999,
1168
- "eval_samples_per_second": 1738.955,
1169
- "eval_steps_per_second": 217.383,
1170
- "step": 81288
1171
  },
1172
  {
1173
  "epoch": 3.0,
1174
- "step": 81288,
1175
- "total_flos": 4.283504864539085e+16,
1176
- "train_loss": 0.17857247165732115,
1177
- "train_runtime": 4471.1905,
1178
- "train_samples_per_second": 581.752,
1179
- "train_steps_per_second": 18.18
1180
  }
1181
  ],
1182
  "logging_steps": 500,
1183
- "max_steps": 81288,
1184
  "num_input_tokens_seen": 0,
1185
  "num_train_epochs": 3,
1186
  "save_steps": 500,
@@ -1196,8 +384,8 @@
1196
  "attributes": {}
1197
  }
1198
  },
1199
- "total_flos": 4.283504864539085e+16,
1200
- "train_batch_size": 32,
1201
  "trial_name": null,
1202
  "trial_params": null
1203
  }
 
1
  {
2
+ "best_metric": 0.12343376874923706,
3
+ "best_model_checkpoint": "multilingual-e5-small-aligned-readability-20241214-new/checkpoint-23439",
4
  "epoch": 3.0,
5
  "eval_steps": 500,
6
+ "global_step": 23439,
7
  "is_hyper_param_search": false,
8
  "is_local_process_zero": true,
9
  "is_world_process_zero": true,
10
  "log_history": [
11
  {
12
+ "epoch": 0.06399590426212723,
13
+ "grad_norm": 2.0385684967041016,
14
+ "learning_rate": 4.8933401595631215e-05,
15
+ "loss": 0.286,
16
  "step": 500
17
  },
18
  {
19
+ "epoch": 0.12799180852425446,
20
+ "grad_norm": 1.2255290746688843,
21
+ "learning_rate": 4.786680319126243e-05,
22
+ "loss": 0.1864,
23
  "step": 1000
24
  },
25
  {
26
+ "epoch": 0.19198771278638166,
27
+ "grad_norm": 1.411687970161438,
28
+ "learning_rate": 4.680020478689364e-05,
29
+ "loss": 0.1751,
30
  "step": 1500
31
  },
32
  {
33
+ "epoch": 0.2559836170485089,
34
+ "grad_norm": 0.9454971551895142,
35
+ "learning_rate": 4.573360638252485e-05,
36
+ "loss": 0.1695,
37
  "step": 2000
38
  },
39
  {
40
+ "epoch": 0.3199795213106361,
41
+ "grad_norm": 1.2582781314849854,
42
+ "learning_rate": 4.4667007978156063e-05,
43
+ "loss": 0.1666,
44
  "step": 2500
45
  },
46
  {
47
+ "epoch": 0.3839754255727633,
48
+ "grad_norm": 1.103311538696289,
49
+ "learning_rate": 4.360040957378728e-05,
50
+ "loss": 0.1645,
51
  "step": 3000
52
  },
53
  {
54
+ "epoch": 0.4479713298348906,
55
+ "grad_norm": 1.237697958946228,
56
+ "learning_rate": 4.2533811169418495e-05,
57
+ "loss": 0.154,
58
  "step": 3500
59
  },
60
  {
61
+ "epoch": 0.5119672340970178,
62
+ "grad_norm": 1.2233279943466187,
63
+ "learning_rate": 4.146721276504971e-05,
64
+ "loss": 0.1554,
65
  "step": 4000
66
  },
67
  {
68
+ "epoch": 0.575963138359145,
69
+ "grad_norm": 1.345629334449768,
70
+ "learning_rate": 4.040061436068092e-05,
71
+ "loss": 0.1585,
72
  "step": 4500
73
  },
74
  {
75
+ "epoch": 0.6399590426212722,
76
+ "grad_norm": 1.7818764448165894,
77
+ "learning_rate": 3.933401595631213e-05,
78
+ "loss": 0.1523,
79
  "step": 5000
80
  },
81
  {
82
+ "epoch": 0.7039549468833994,
83
+ "grad_norm": 2.2489941120147705,
84
+ "learning_rate": 3.8267417551943344e-05,
85
+ "loss": 0.1546,
86
  "step": 5500
87
  },
88
  {
89
+ "epoch": 0.7679508511455266,
90
+ "grad_norm": 0.6691417098045349,
91
+ "learning_rate": 3.7200819147574556e-05,
92
+ "loss": 0.1485,
93
  "step": 6000
94
  },
95
  {
96
+ "epoch": 0.831946755407654,
97
+ "grad_norm": 1.420074462890625,
98
+ "learning_rate": 3.613422074320577e-05,
99
+ "loss": 0.1479,
100
  "step": 6500
101
  },
102
  {
103
+ "epoch": 0.8959426596697811,
104
+ "grad_norm": 1.6003996133804321,
105
+ "learning_rate": 3.506762233883698e-05,
106
+ "loss": 0.1435,
107
  "step": 7000
108
  },
109
  {
110
+ "epoch": 0.9599385639319084,
111
+ "grad_norm": 1.0591192245483398,
112
+ "learning_rate": 3.400102393446819e-05,
113
+ "loss": 0.1484,
114
  "step": 7500
115
  },
116
  {
117
+ "epoch": 1.0,
118
+ "eval_loss": 0.13237929344177246,
119
+ "eval_mse": 0.1323792923084062,
120
+ "eval_runtime": 106.0386,
121
+ "eval_samples_per_second": 1717.403,
122
+ "eval_steps_per_second": 214.677,
123
+ "step": 7813
124
+ },
125
+ {
126
+ "epoch": 1.0239344681940357,
127
+ "grad_norm": 0.48294901847839355,
128
+ "learning_rate": 3.293442553009941e-05,
129
+ "loss": 0.1372,
130
  "step": 8000
131
  },
132
  {
133
+ "epoch": 1.0879303724561629,
134
+ "grad_norm": 1.516916036605835,
135
+ "learning_rate": 3.1867827125730624e-05,
136
+ "loss": 0.1237,
137
  "step": 8500
138
  },
139
  {
140
+ "epoch": 1.15192627671829,
141
+ "grad_norm": 1.1309595108032227,
142
+ "learning_rate": 3.0801228721361836e-05,
143
+ "loss": 0.1212,
144
  "step": 9000
145
  },
146
  {
147
+ "epoch": 1.2159221809804173,
148
+ "grad_norm": 1.1831127405166626,
149
+ "learning_rate": 2.9734630316993045e-05,
150
+ "loss": 0.1217,
151
  "step": 9500
152
  },
153
  {
154
+ "epoch": 1.2799180852425445,
155
+ "grad_norm": 1.3023440837860107,
156
+ "learning_rate": 2.866803191262426e-05,
157
+ "loss": 0.1233,
158
  "step": 10000
159
  },
160
  {
161
+ "epoch": 1.3439139895046717,
162
+ "grad_norm": 0.9876078963279724,
163
+ "learning_rate": 2.7601433508255476e-05,
164
+ "loss": 0.1213,
165
  "step": 10500
166
  },
167
  {
168
+ "epoch": 1.4079098937667989,
169
+ "grad_norm": 0.9371439218521118,
170
+ "learning_rate": 2.6534835103886685e-05,
171
+ "loss": 0.1232,
172
  "step": 11000
173
  },
174
  {
175
+ "epoch": 1.471905798028926,
176
+ "grad_norm": 0.942789614200592,
177
+ "learning_rate": 2.54682366995179e-05,
178
+ "loss": 0.1202,
179
  "step": 11500
180
  },
181
  {
182
+ "epoch": 1.5359017022910533,
183
+ "grad_norm": 1.451653003692627,
184
+ "learning_rate": 2.4401638295149112e-05,
185
+ "loss": 0.1181,
186
  "step": 12000
187
  },
188
  {
189
+ "epoch": 1.5998976065531805,
190
+ "grad_norm": 0.7980997562408447,
191
+ "learning_rate": 2.3335039890780325e-05,
192
+ "loss": 0.1206,
193
  "step": 12500
194
  },
195
  {
196
+ "epoch": 1.6638935108153077,
197
+ "grad_norm": 1.9100792407989502,
198
+ "learning_rate": 2.2268441486411537e-05,
199
+ "loss": 0.1186,
200
  "step": 13000
201
  },
202
  {
203
+ "epoch": 1.727889415077435,
204
+ "grad_norm": 1.6824227571487427,
205
+ "learning_rate": 2.120184308204275e-05,
206
+ "loss": 0.1207,
207
  "step": 13500
208
  },
209
  {
210
+ "epoch": 1.7918853193395623,
211
+ "grad_norm": 1.731228232383728,
212
+ "learning_rate": 2.0135244677673965e-05,
213
+ "loss": 0.1192,
214
  "step": 14000
215
  },
216
  {
217
+ "epoch": 1.8558812236016895,
218
+ "grad_norm": 1.039025902748108,
219
+ "learning_rate": 1.9068646273305177e-05,
220
+ "loss": 0.1155,
221
  "step": 14500
222
  },
223
  {
224
+ "epoch": 1.9198771278638167,
225
+ "grad_norm": 0.7578993439674377,
226
+ "learning_rate": 1.800204786893639e-05,
227
+ "loss": 0.1173,
228
  "step": 15000
229
  },
230
  {
231
+ "epoch": 1.983873032125944,
232
+ "grad_norm": 1.1431925296783447,
233
+ "learning_rate": 1.69354494645676e-05,
234
+ "loss": 0.1157,
235
  "step": 15500
236
  },
237
  {
238
+ "epoch": 2.0,
239
+ "eval_loss": 0.12407750636339188,
240
+ "eval_mse": 0.12407751065680599,
241
+ "eval_runtime": 102.1252,
242
+ "eval_samples_per_second": 1783.214,
243
+ "eval_steps_per_second": 222.903,
244
+ "step": 15626
245
+ },
246
+ {
247
+ "epoch": 2.0478689363880713,
248
+ "grad_norm": 1.0193284749984741,
249
+ "learning_rate": 1.5868851060198814e-05,
250
+ "loss": 0.1032,
251
  "step": 16000
252
  },
253
  {
254
+ "epoch": 2.1118648406501985,
255
+ "grad_norm": 0.8400812745094299,
256
+ "learning_rate": 1.480225265583003e-05,
257
+ "loss": 0.0983,
258
  "step": 16500
259
  },
260
  {
261
+ "epoch": 2.1758607449123257,
262
+ "grad_norm": 0.8080986738204956,
263
+ "learning_rate": 1.3735654251461241e-05,
264
+ "loss": 0.0991,
265
  "step": 17000
266
  },
267
  {
268
+ "epoch": 2.239856649174453,
269
+ "grad_norm": 1.2077598571777344,
270
+ "learning_rate": 1.2669055847092454e-05,
271
+ "loss": 0.0984,
272
  "step": 17500
273
  },
274
  {
275
+ "epoch": 2.30385255343658,
276
+ "grad_norm": 1.0903464555740356,
277
+ "learning_rate": 1.1602457442723666e-05,
278
+ "loss": 0.0966,
279
  "step": 18000
280
  },
281
  {
282
+ "epoch": 2.3678484576987073,
283
+ "grad_norm": 1.89614737033844,
284
+ "learning_rate": 1.053585903835488e-05,
285
+ "loss": 0.0997,
286
  "step": 18500
287
  },
288
  {
289
+ "epoch": 2.4318443619608345,
290
+ "grad_norm": 0.732280969619751,
291
+ "learning_rate": 9.469260633986092e-06,
292
+ "loss": 0.0969,
293
  "step": 19000
294
  },
295
  {
296
+ "epoch": 2.4958402662229617,
297
+ "grad_norm": 0.8635444641113281,
298
+ "learning_rate": 8.402662229617304e-06,
299
+ "loss": 0.0972,
300
  "step": 19500
301
  },
302
  {
303
+ "epoch": 2.559836170485089,
304
+ "grad_norm": 1.2498939037322998,
305
+ "learning_rate": 7.336063825248518e-06,
306
+ "loss": 0.0971,
307
  "step": 20000
308
  },
309
  {
310
+ "epoch": 2.623832074747216,
311
+ "grad_norm": 1.8945280313491821,
312
+ "learning_rate": 6.26946542087973e-06,
313
+ "loss": 0.0965,
314
  "step": 20500
315
  },
316
  {
317
+ "epoch": 2.6878279790093433,
318
+ "grad_norm": 0.8438006043434143,
319
+ "learning_rate": 5.202867016510943e-06,
320
+ "loss": 0.0964,
321
  "step": 21000
322
  },
323
  {
324
+ "epoch": 2.7518238832714705,
325
+ "grad_norm": 0.7478394508361816,
326
+ "learning_rate": 4.1362686121421564e-06,
327
+ "loss": 0.0987,
328
  "step": 21500
329
  },
330
  {
331
+ "epoch": 2.8158197875335977,
332
+ "grad_norm": 1.5676864385604858,
333
+ "learning_rate": 3.069670207773369e-06,
334
+ "loss": 0.097,
335
  "step": 22000
336
  },
337
  {
338
+ "epoch": 2.879815691795725,
339
+ "grad_norm": 1.0723648071289062,
340
+ "learning_rate": 2.003071803404582e-06,
341
+ "loss": 0.0984,
342
  "step": 22500
343
  },
344
  {
345
+ "epoch": 2.943811596057852,
346
+ "grad_norm": 0.886972963809967,
347
+ "learning_rate": 9.364733990357951e-07,
348
+ "loss": 0.096,
349
  "step": 23000
350
  },
351
  {
352
  "epoch": 3.0,
353
+ "eval_loss": 0.12343376874923706,
354
+ "eval_mse": 0.12343375904401642,
355
+ "eval_runtime": 98.083,
356
+ "eval_samples_per_second": 1856.703,
357
+ "eval_steps_per_second": 232.089,
358
+ "step": 23439
359
  },
360
  {
361
  "epoch": 3.0,
362
+ "step": 23439,
363
+ "total_flos": 4.9403660544e+16,
364
+ "train_loss": 0.12799415017126403,
365
+ "train_runtime": 3304.0151,
366
+ "train_samples_per_second": 907.986,
367
+ "train_steps_per_second": 7.094
368
  }
369
  ],
370
  "logging_steps": 500,
371
+ "max_steps": 23439,
372
  "num_input_tokens_seen": 0,
373
  "num_train_epochs": 3,
374
  "save_steps": 500,
 
384
  "attributes": {}
385
  }
386
  },
387
+ "total_flos": 4.9403660544e+16,
388
+ "train_batch_size": 128,
389
  "trial_name": null,
390
  "trial_params": null
391
  }
training_args.bin CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:f7846a1e21a9a7ecb44d6f53ed2f2bbd4dbcdeca0216b7cbfc52b373edb803d5
+oid sha256:793b914c08d676ee8da235d2abbfc74fb2138329f5b854e5c11d0eb0e3edc5d7
 size 5368
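
`training_args.bin` is a binary LFS-tracked file, so the diff only shows its new hash. To see which hyperparameters actually changed, one option is to download both revisions and load them locally; a sketch follows, assuming the file is a pickled `TrainingArguments` object (the Trainer's default) and that compatible `transformers`/`torch` versions are installed. The usual caution about unpickling untrusted files applies.

```python
# Sketch: inspect a downloaded training_args.bin (assumes it is a pickled
# TrainingArguments object; weights_only=False is needed for non-tensor pickles).
import torch

args = torch.load("training_args.bin", weights_only=False)
print(type(args).__name__)               # e.g. TrainingArguments
print(args.per_device_train_batch_size)  # should match the card (128 in the new run)
print(args.learning_rate, args.num_train_epochs, args.seed)
```
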