Vishal24 committed on
Commit b376eb2 · verified · 1 Parent(s): 829b213

Upload checkpoint-129030

Files changed (5):
  1. optimizer.pt +3 -0
  2. rng_state.pth +3 -0
  3. scheduler.pt +3 -0
  4. trainer_state.json +1919 -0
  5. training_args.bin +3 -0
optimizer.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:4b930d5bcbe3df6d5ee9e15433eeacb1cff6f66fe5522d292d659eb4c38da3c0
+ size 866895354
rng_state.pth ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:aa280e2e6dbd3a37c8dc2dc5fe9c7782cf7832367d7c106285ca3d9da6e271f7
+ size 14244
scheduler.pt ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:1fe476619a2e0e15fce441e7fcb5a118b511efee40d7b844292db4a9762411b5
+ size 1064
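
Each of the three binary files above is added as a Git LFS pointer rather than as raw bytes: a three-line text stub recording the spec version, the sha256 `oid`, and the `size` in bytes of the real object. A minimal sketch of parsing such a stub (using the `scheduler.pt` pointer from this commit; `parse_lfs_pointer` is an illustrative helper, not part of any library):

```python
def parse_lfs_pointer(text: str) -> dict:
    """Split a Git LFS pointer stub into its key/value fields."""
    fields = {}
    for line in text.strip().splitlines():
        # Each pointer line is "<key> <value>".
        key, _, value = line.partition(" ")
        fields[key] = value
    return fields

# The scheduler.pt pointer added in this commit.
pointer = (
    "version https://git-lfs.github.com/spec/v1\n"
    "oid sha256:1fe476619a2e0e15fce441e7fcb5a118b511efee40d7b844292db4a9762411b5\n"
    "size 1064\n"
)

info = parse_lfs_pointer(pointer)
print(info["size"])  # → 1064 (bytes of the real scheduler.pt object)
```

The pointer is what lives in the Git history; the 866 MB `optimizer.pt` blob itself is fetched from LFS storage on checkout.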
trainer_state.json ADDED
@@ -0,0 +1,1919 @@
+ {
+ "best_metric": null,
+ "best_model_checkpoint": null,
+ "epoch": 10.0,
+ "eval_steps": 500,
+ "global_step": 129030,
+ "is_hyper_param_search": false,
+ "is_local_process_zero": true,
+ "is_world_process_zero": true,
+ "log_history": [
+ {
+ "epoch": 0.038750678136867396,
+ "grad_norm": 7.044970512390137,
+ "learning_rate": 1.9922498643726265e-05,
+ "loss": 5.0771,
+ "step": 500
+ },
+ {
+ "epoch": 0.07750135627373479,
+ "grad_norm": 6.129168510437012,
+ "learning_rate": 1.9844997287452532e-05,
+ "loss": 4.5935,
+ "step": 1000
+ },
+ {
+ "epoch": 0.11625203441060218,
+ "grad_norm": 6.968145847320557,
+ "learning_rate": 1.9767495931178796e-05,
+ "loss": 4.3872,
+ "step": 1500
+ },
+ {
+ "epoch": 0.15500271254746958,
+ "grad_norm": 6.270899295806885,
+ "learning_rate": 1.9689994574905063e-05,
+ "loss": 4.2507,
+ "step": 2000
+ },
+ {
+ "epoch": 0.19375339068433697,
+ "grad_norm": 6.4452104568481445,
+ "learning_rate": 1.9612493218631326e-05,
+ "loss": 4.135,
+ "step": 2500
+ },
+ {
+ "epoch": 0.23250406882120436,
+ "grad_norm": 6.564201354980469,
+ "learning_rate": 1.9534991862357593e-05,
+ "loss": 4.0704,
+ "step": 3000
+ },
+ {
+ "epoch": 0.2712547469580718,
+ "grad_norm": 6.716593265533447,
+ "learning_rate": 1.9457490506083857e-05,
+ "loss": 3.9814,
+ "step": 3500
+ },
+ {
+ "epoch": 0.31000542509493917,
+ "grad_norm": 6.578341484069824,
+ "learning_rate": 1.9379989149810124e-05,
+ "loss": 3.9078,
+ "step": 4000
+ },
+ {
+ "epoch": 0.34875610323180656,
+ "grad_norm": 6.6163482666015625,
+ "learning_rate": 1.9302487793536387e-05,
+ "loss": 3.858,
+ "step": 4500
+ },
+ {
+ "epoch": 0.38750678136867395,
+ "grad_norm": 5.90346097946167,
+ "learning_rate": 1.9224986437262654e-05,
+ "loss": 3.8036,
+ "step": 5000
+ },
+ {
+ "epoch": 0.42625745950554134,
+ "grad_norm": 5.8487467765808105,
+ "learning_rate": 1.9147485080988918e-05,
+ "loss": 3.7518,
+ "step": 5500
+ },
+ {
+ "epoch": 0.4650081376424087,
+ "grad_norm": 6.573217391967773,
+ "learning_rate": 1.9069983724715185e-05,
+ "loss": 3.7164,
+ "step": 6000
+ },
+ {
+ "epoch": 0.5037588157792762,
+ "grad_norm": 5.970231056213379,
+ "learning_rate": 1.899248236844145e-05,
+ "loss": 3.673,
+ "step": 6500
+ },
+ {
+ "epoch": 0.5425094939161436,
+ "grad_norm": 6.683649063110352,
+ "learning_rate": 1.8914981012167715e-05,
+ "loss": 3.6405,
+ "step": 7000
+ },
+ {
+ "epoch": 0.581260172053011,
+ "grad_norm": 6.538488864898682,
+ "learning_rate": 1.883747965589398e-05,
+ "loss": 3.6133,
+ "step": 7500
+ },
+ {
+ "epoch": 0.6200108501898783,
+ "grad_norm": 6.528162479400635,
+ "learning_rate": 1.8759978299620246e-05,
+ "loss": 3.5754,
+ "step": 8000
+ },
+ {
+ "epoch": 0.6587615283267457,
+ "grad_norm": 6.43408203125,
+ "learning_rate": 1.868247694334651e-05,
+ "loss": 3.5548,
+ "step": 8500
+ },
+ {
+ "epoch": 0.6975122064636131,
+ "grad_norm": 6.131889820098877,
+ "learning_rate": 1.8604975587072776e-05,
+ "loss": 3.5228,
+ "step": 9000
+ },
+ {
+ "epoch": 0.7362628846004805,
+ "grad_norm": 6.320891857147217,
+ "learning_rate": 1.852747423079904e-05,
+ "loss": 3.4999,
+ "step": 9500
+ },
+ {
+ "epoch": 0.7750135627373479,
+ "grad_norm": 6.105418682098389,
+ "learning_rate": 1.8449972874525307e-05,
+ "loss": 3.5071,
+ "step": 10000
+ },
+ {
+ "epoch": 0.8137642408742153,
+ "grad_norm": 6.774458885192871,
+ "learning_rate": 1.837247151825157e-05,
+ "loss": 3.4616,
+ "step": 10500
+ },
+ {
+ "epoch": 0.8525149190110827,
+ "grad_norm": 6.263659477233887,
+ "learning_rate": 1.8294970161977838e-05,
+ "loss": 3.4499,
+ "step": 11000
+ },
+ {
+ "epoch": 0.8912655971479501,
+ "grad_norm": 6.58251428604126,
+ "learning_rate": 1.82174688057041e-05,
+ "loss": 3.4157,
+ "step": 11500
+ },
+ {
+ "epoch": 0.9300162752848175,
+ "grad_norm": 6.030143737792969,
+ "learning_rate": 1.8139967449430368e-05,
+ "loss": 3.3842,
+ "step": 12000
+ },
+ {
+ "epoch": 0.9687669534216848,
+ "grad_norm": 6.361506462097168,
+ "learning_rate": 1.806246609315663e-05,
+ "loss": 3.3707,
+ "step": 12500
+ },
+ {
+ "epoch": 1.0,
+ "eval_loss": 3.2508351802825928,
+ "eval_runtime": 267.1986,
+ "eval_samples_per_second": 772.702,
+ "eval_steps_per_second": 12.077,
+ "step": 12903
+ },
+ {
+ "epoch": 1.0075176315585523,
+ "grad_norm": 6.418643474578857,
+ "learning_rate": 1.79849647368829e-05,
+ "loss": 3.3673,
+ "step": 13000
+ },
+ {
+ "epoch": 1.0462683096954197,
+ "grad_norm": 6.310774326324463,
+ "learning_rate": 1.7907463380609162e-05,
+ "loss": 3.3276,
+ "step": 13500
+ },
+ {
+ "epoch": 1.0850189878322871,
+ "grad_norm": 6.517366409301758,
+ "learning_rate": 1.782996202433543e-05,
+ "loss": 3.3288,
+ "step": 14000
+ },
+ {
+ "epoch": 1.1237696659691545,
+ "grad_norm": 6.407958984375,
+ "learning_rate": 1.7752460668061693e-05,
+ "loss": 3.3003,
+ "step": 14500
+ },
+ {
+ "epoch": 1.162520344106022,
+ "grad_norm": 6.145129203796387,
+ "learning_rate": 1.767495931178796e-05,
+ "loss": 3.2694,
+ "step": 15000
+ },
+ {
+ "epoch": 1.2012710222428893,
+ "grad_norm": 6.586604118347168,
+ "learning_rate": 1.7597457955514223e-05,
+ "loss": 3.2627,
+ "step": 15500
+ },
+ {
+ "epoch": 1.2400217003797567,
+ "grad_norm": 6.122056007385254,
+ "learning_rate": 1.751995659924049e-05,
+ "loss": 3.2631,
+ "step": 16000
+ },
+ {
+ "epoch": 1.278772378516624,
+ "grad_norm": 6.545727252960205,
+ "learning_rate": 1.7442455242966754e-05,
+ "loss": 3.2324,
+ "step": 16500
+ },
+ {
+ "epoch": 1.3175230566534915,
+ "grad_norm": 6.427816390991211,
+ "learning_rate": 1.7364953886693017e-05,
+ "loss": 3.227,
+ "step": 17000
+ },
+ {
+ "epoch": 1.3562737347903588,
+ "grad_norm": 6.253689765930176,
+ "learning_rate": 1.7287452530419284e-05,
+ "loss": 3.2099,
+ "step": 17500
+ },
+ {
+ "epoch": 1.3950244129272262,
+ "grad_norm": 6.5702080726623535,
+ "learning_rate": 1.7209951174145548e-05,
+ "loss": 3.2102,
+ "step": 18000
+ },
+ {
+ "epoch": 1.4337750910640936,
+ "grad_norm": 6.4822564125061035,
+ "learning_rate": 1.7132449817871815e-05,
+ "loss": 3.1935,
+ "step": 18500
+ },
+ {
+ "epoch": 1.472525769200961,
+ "grad_norm": 6.524315357208252,
+ "learning_rate": 1.705494846159808e-05,
+ "loss": 3.1955,
+ "step": 19000
+ },
+ {
+ "epoch": 1.5112764473378284,
+ "grad_norm": 6.302344799041748,
+ "learning_rate": 1.6977447105324345e-05,
+ "loss": 3.1726,
+ "step": 19500
+ },
+ {
+ "epoch": 1.5500271254746958,
+ "grad_norm": 5.837028503417969,
+ "learning_rate": 1.689994574905061e-05,
+ "loss": 3.1277,
+ "step": 20000
+ },
+ {
+ "epoch": 1.5887778036115632,
+ "grad_norm": 6.489377975463867,
+ "learning_rate": 1.6822444392776876e-05,
+ "loss": 3.1414,
+ "step": 20500
+ },
+ {
+ "epoch": 1.6275284817484306,
+ "grad_norm": 6.543872833251953,
+ "learning_rate": 1.674494303650314e-05,
+ "loss": 3.104,
+ "step": 21000
+ },
+ {
+ "epoch": 1.666279159885298,
+ "grad_norm": 6.05628776550293,
+ "learning_rate": 1.6667441680229406e-05,
+ "loss": 3.1459,
+ "step": 21500
+ },
+ {
+ "epoch": 1.7050298380221653,
+ "grad_norm": 6.027078151702881,
+ "learning_rate": 1.658994032395567e-05,
+ "loss": 3.0963,
+ "step": 22000
+ },
+ {
+ "epoch": 1.7437805161590327,
+ "grad_norm": 6.577582359313965,
+ "learning_rate": 1.6512438967681937e-05,
+ "loss": 3.118,
+ "step": 22500
+ },
+ {
+ "epoch": 1.7825311942959001,
+ "grad_norm": 5.9164204597473145,
+ "learning_rate": 1.64349376114082e-05,
+ "loss": 3.0928,
+ "step": 23000
+ },
+ {
+ "epoch": 1.8212818724327677,
+ "grad_norm": 6.155348300933838,
+ "learning_rate": 1.6357436255134468e-05,
+ "loss": 3.0885,
+ "step": 23500
+ },
+ {
+ "epoch": 1.8600325505696351,
+ "grad_norm": 6.302849769592285,
+ "learning_rate": 1.627993489886073e-05,
+ "loss": 3.0741,
+ "step": 24000
+ },
+ {
+ "epoch": 1.8987832287065025,
+ "grad_norm": 6.140907287597656,
+ "learning_rate": 1.6202433542586998e-05,
+ "loss": 3.0633,
+ "step": 24500
+ },
+ {
+ "epoch": 1.93753390684337,
+ "grad_norm": 5.85639762878418,
+ "learning_rate": 1.612493218631326e-05,
+ "loss": 3.0401,
+ "step": 25000
+ },
+ {
+ "epoch": 1.9762845849802373,
+ "grad_norm": 6.558920383453369,
+ "learning_rate": 1.604743083003953e-05,
+ "loss": 3.05,
+ "step": 25500
+ },
+ {
+ "epoch": 2.0,
+ "eval_loss": 2.9521713256835938,
+ "eval_runtime": 258.4886,
+ "eval_samples_per_second": 798.739,
+ "eval_steps_per_second": 12.484,
+ "step": 25806
+ },
+ {
+ "epoch": 2.0150352631171047,
+ "grad_norm": 6.003655433654785,
+ "learning_rate": 1.5969929473765792e-05,
+ "loss": 3.0292,
+ "step": 26000
+ },
+ {
+ "epoch": 2.053785941253972,
+ "grad_norm": 6.43280029296875,
+ "learning_rate": 1.589242811749206e-05,
+ "loss": 3.0205,
+ "step": 26500
+ },
+ {
+ "epoch": 2.0925366193908395,
+ "grad_norm": 6.051511287689209,
+ "learning_rate": 1.5814926761218323e-05,
+ "loss": 3.0152,
+ "step": 27000
+ },
+ {
+ "epoch": 2.131287297527707,
+ "grad_norm": 7.381418704986572,
+ "learning_rate": 1.573742540494459e-05,
+ "loss": 3.0067,
+ "step": 27500
+ },
+ {
+ "epoch": 2.1700379756645742,
+ "grad_norm": 6.032004356384277,
+ "learning_rate": 1.5659924048670853e-05,
+ "loss": 2.9821,
+ "step": 28000
+ },
+ {
+ "epoch": 2.2087886538014416,
+ "grad_norm": 6.481622695922852,
+ "learning_rate": 1.558242269239712e-05,
+ "loss": 2.9824,
+ "step": 28500
+ },
+ {
+ "epoch": 2.247539331938309,
+ "grad_norm": 5.934979438781738,
+ "learning_rate": 1.5504921336123384e-05,
+ "loss": 2.9708,
+ "step": 29000
+ },
+ {
+ "epoch": 2.2862900100751764,
+ "grad_norm": 7.498392581939697,
+ "learning_rate": 1.542741997984965e-05,
+ "loss": 2.9836,
+ "step": 29500
+ },
+ {
+ "epoch": 2.325040688212044,
+ "grad_norm": 6.350077152252197,
+ "learning_rate": 1.5349918623575914e-05,
+ "loss": 2.9608,
+ "step": 30000
+ },
+ {
+ "epoch": 2.363791366348911,
+ "grad_norm": 5.6795783042907715,
+ "learning_rate": 1.527241726730218e-05,
+ "loss": 2.9551,
+ "step": 30500
+ },
+ {
+ "epoch": 2.4025420444857786,
+ "grad_norm": 6.395376682281494,
+ "learning_rate": 1.5194915911028445e-05,
+ "loss": 2.9551,
+ "step": 31000
+ },
+ {
+ "epoch": 2.441292722622646,
+ "grad_norm": 6.238061904907227,
+ "learning_rate": 1.511741455475471e-05,
+ "loss": 2.9527,
+ "step": 31500
+ },
+ {
+ "epoch": 2.4800434007595134,
+ "grad_norm": 6.641284465789795,
+ "learning_rate": 1.5039913198480975e-05,
+ "loss": 2.9444,
+ "step": 32000
+ },
+ {
+ "epoch": 2.5187940788963807,
+ "grad_norm": 6.30321741104126,
+ "learning_rate": 1.496241184220724e-05,
+ "loss": 2.9346,
+ "step": 32500
+ },
+ {
+ "epoch": 2.557544757033248,
+ "grad_norm": 8.681157112121582,
+ "learning_rate": 1.4884910485933506e-05,
+ "loss": 2.9204,
+ "step": 33000
+ },
+ {
+ "epoch": 2.5962954351701155,
+ "grad_norm": 6.423407077789307,
+ "learning_rate": 1.4807409129659771e-05,
+ "loss": 2.9066,
+ "step": 33500
+ },
+ {
+ "epoch": 2.635046113306983,
+ "grad_norm": 6.697604179382324,
+ "learning_rate": 1.4729907773386036e-05,
+ "loss": 2.9079,
+ "step": 34000
+ },
+ {
+ "epoch": 2.6737967914438503,
+ "grad_norm": 6.646244049072266,
+ "learning_rate": 1.4652406417112302e-05,
+ "loss": 2.9148,
+ "step": 34500
+ },
+ {
+ "epoch": 2.7125474695807177,
+ "grad_norm": 6.80411958694458,
+ "learning_rate": 1.4574905060838567e-05,
+ "loss": 2.9005,
+ "step": 35000
+ },
+ {
+ "epoch": 2.751298147717585,
+ "grad_norm": 6.345988750457764,
+ "learning_rate": 1.4497403704564832e-05,
+ "loss": 2.888,
+ "step": 35500
+ },
+ {
+ "epoch": 2.7900488258544525,
+ "grad_norm": 5.965686798095703,
+ "learning_rate": 1.4419902348291098e-05,
+ "loss": 2.8845,
+ "step": 36000
+ },
+ {
+ "epoch": 2.82879950399132,
+ "grad_norm": 6.068357944488525,
+ "learning_rate": 1.4342400992017363e-05,
+ "loss": 2.8835,
+ "step": 36500
+ },
+ {
+ "epoch": 2.8675501821281872,
+ "grad_norm": 5.874370098114014,
+ "learning_rate": 1.4264899635743628e-05,
+ "loss": 2.8904,
+ "step": 37000
+ },
+ {
+ "epoch": 2.9063008602650546,
+ "grad_norm": 6.0566935539245605,
+ "learning_rate": 1.4187398279469893e-05,
+ "loss": 2.8823,
+ "step": 37500
+ },
+ {
+ "epoch": 2.945051538401922,
+ "grad_norm": 6.21787691116333,
+ "learning_rate": 1.4109896923196159e-05,
+ "loss": 2.867,
+ "step": 38000
+ },
+ {
+ "epoch": 2.9838022165387894,
+ "grad_norm": 6.055897235870361,
+ "learning_rate": 1.4032395566922424e-05,
+ "loss": 2.867,
+ "step": 38500
+ },
+ {
+ "epoch": 3.0,
+ "eval_loss": 2.775702953338623,
+ "eval_runtime": 259.0101,
+ "eval_samples_per_second": 797.131,
+ "eval_steps_per_second": 12.459,
+ "step": 38709
+ },
574
+ {
575
+ "epoch": 3.022552894675657,
576
+ "grad_norm": 5.503760814666748,
577
+ "learning_rate": 1.3954894210648689e-05,
578
+ "loss": 2.8428,
579
+ "step": 39000
580
+ },
581
+ {
582
+ "epoch": 3.061303572812524,
583
+ "grad_norm": 6.250561714172363,
584
+ "learning_rate": 1.3877392854374954e-05,
585
+ "loss": 2.842,
586
+ "step": 39500
587
+ },
588
+ {
589
+ "epoch": 3.1000542509493916,
590
+ "grad_norm": 6.394408226013184,
591
+ "learning_rate": 1.379989149810122e-05,
592
+ "loss": 2.8368,
593
+ "step": 40000
594
+ },
595
+ {
596
+ "epoch": 3.138804929086259,
597
+ "grad_norm": 5.7096428871154785,
598
+ "learning_rate": 1.3722390141827483e-05,
599
+ "loss": 2.8253,
600
+ "step": 40500
601
+ },
602
+ {
603
+ "epoch": 3.1775556072231264,
604
+ "grad_norm": 6.807374000549316,
605
+ "learning_rate": 1.3644888785553749e-05,
606
+ "loss": 2.821,
607
+ "step": 41000
608
+ },
609
+ {
610
+ "epoch": 3.2163062853599937,
611
+ "grad_norm": 6.367000102996826,
612
+ "learning_rate": 1.3567387429280014e-05,
613
+ "loss": 2.8302,
614
+ "step": 41500
615
+ },
616
+ {
617
+ "epoch": 3.255056963496861,
618
+ "grad_norm": 6.30033540725708,
619
+ "learning_rate": 1.3489886073006279e-05,
620
+ "loss": 2.8191,
621
+ "step": 42000
622
+ },
623
+ {
624
+ "epoch": 3.2938076416337285,
625
+ "grad_norm": 7.257653713226318,
626
+ "learning_rate": 1.3412384716732544e-05,
627
+ "loss": 2.8196,
628
+ "step": 42500
629
+ },
630
+ {
631
+ "epoch": 3.332558319770596,
632
+ "grad_norm": 7.1162109375,
633
+ "learning_rate": 1.333488336045881e-05,
634
+ "loss": 2.817,
635
+ "step": 43000
636
+ },
637
+ {
638
+ "epoch": 3.3713089979074633,
639
+ "grad_norm": 6.336881160736084,
640
+ "learning_rate": 1.3257382004185075e-05,
641
+ "loss": 2.8064,
642
+ "step": 43500
643
+ },
644
+ {
645
+ "epoch": 3.4100596760443307,
646
+ "grad_norm": 6.641462326049805,
647
+ "learning_rate": 1.317988064791134e-05,
648
+ "loss": 2.8035,
649
+ "step": 44000
650
+ },
651
+ {
652
+ "epoch": 3.448810354181198,
653
+ "grad_norm": 6.033754348754883,
654
+ "learning_rate": 1.3102379291637605e-05,
655
+ "loss": 2.7976,
656
+ "step": 44500
657
+ },
658
+ {
659
+ "epoch": 3.4875610323180655,
660
+ "grad_norm": 6.544773101806641,
661
+ "learning_rate": 1.302487793536387e-05,
662
+ "loss": 2.8048,
663
+ "step": 45000
664
+ },
665
+ {
666
+ "epoch": 3.526311710454933,
667
+ "grad_norm": 6.382020950317383,
668
+ "learning_rate": 1.2947376579090136e-05,
669
+ "loss": 2.7982,
670
+ "step": 45500
671
+ },
672
+ {
673
+ "epoch": 3.5650623885918002,
674
+ "grad_norm": 6.194632053375244,
675
+ "learning_rate": 1.2869875222816401e-05,
676
+ "loss": 2.7749,
677
+ "step": 46000
678
+ },
679
+ {
680
+ "epoch": 3.6038130667286676,
681
+ "grad_norm": 6.429641246795654,
682
+ "learning_rate": 1.2792373866542665e-05,
683
+ "loss": 2.7853,
684
+ "step": 46500
685
+ },
686
+ {
687
+ "epoch": 3.642563744865535,
688
+ "grad_norm": 6.209822177886963,
689
+ "learning_rate": 1.271487251026893e-05,
690
+ "loss": 2.7841,
691
+ "step": 47000
692
+ },
693
+ {
694
+ "epoch": 3.6813144230024024,
695
+ "grad_norm": 6.935910701751709,
696
+ "learning_rate": 1.2637371153995195e-05,
697
+ "loss": 2.7681,
698
+ "step": 47500
699
+ },
700
+ {
701
+ "epoch": 3.72006510113927,
702
+ "grad_norm": 7.021639347076416,
703
+ "learning_rate": 1.255986979772146e-05,
704
+ "loss": 2.7658,
705
+ "step": 48000
706
+ },
707
+ {
708
+ "epoch": 3.758815779276137,
709
+ "grad_norm": 6.242121696472168,
710
+ "learning_rate": 1.2482368441447726e-05,
711
+ "loss": 2.7698,
712
+ "step": 48500
713
+ },
714
+ {
715
+ "epoch": 3.7975664574130046,
716
+ "grad_norm": 6.123905658721924,
717
+ "learning_rate": 1.2404867085173991e-05,
718
+ "loss": 2.7711,
719
+ "step": 49000
720
+ },
721
+ {
722
+ "epoch": 3.836317135549872,
723
+ "grad_norm": 6.735771179199219,
724
+ "learning_rate": 1.2327365728900256e-05,
725
+ "loss": 2.726,
726
+ "step": 49500
727
+ },
728
+ {
729
+ "epoch": 3.8750678136867394,
730
+ "grad_norm": 6.921602725982666,
731
+ "learning_rate": 1.2249864372626522e-05,
732
+ "loss": 2.7545,
733
+ "step": 50000
734
+ },
735
+ {
736
+ "epoch": 3.9138184918236067,
737
+ "grad_norm": 6.343456745147705,
738
+ "learning_rate": 1.2172363016352787e-05,
739
+ "loss": 2.7474,
740
+ "step": 50500
741
+ },
742
+ {
743
+ "epoch": 3.9525691699604746,
744
+ "grad_norm": 6.30169677734375,
745
+ "learning_rate": 1.2094861660079052e-05,
746
+ "loss": 2.7467,
747
+ "step": 51000
748
+ },
749
+ {
750
+ "epoch": 3.9913198480973415,
751
+ "grad_norm": 6.6629767417907715,
752
+ "learning_rate": 1.2017360303805317e-05,
753
+ "loss": 2.7475,
754
+ "step": 51500
755
+ },
756
+ {
757
+ "epoch": 4.0,
758
+ "eval_loss": 2.6640822887420654,
759
+ "eval_runtime": 260.2494,
760
+ "eval_samples_per_second": 793.335,
761
+ "eval_steps_per_second": 12.4,
762
+ "step": 51612
763
+ },
764
+ {
765
+ "epoch": 4.030070526234209,
766
+ "grad_norm": 6.397671222686768,
767
+ "learning_rate": 1.1939858947531581e-05,
768
+ "loss": 2.7311,
769
+ "step": 52000
770
+ },
771
+ {
772
+ "epoch": 4.068821204371076,
773
+ "grad_norm": 6.374961853027344,
774
+ "learning_rate": 1.1862357591257846e-05,
775
+ "loss": 2.7119,
776
+ "step": 52500
777
+ },
778
+ {
779
+ "epoch": 4.107571882507944,
780
+ "grad_norm": 5.920938968658447,
781
+ "learning_rate": 1.1784856234984112e-05,
782
+ "loss": 2.7217,
783
+ "step": 53000
784
+ },
785
+ {
786
+ "epoch": 4.146322560644811,
787
+ "grad_norm": 6.377143859863281,
788
+ "learning_rate": 1.1707354878710377e-05,
789
+ "loss": 2.7044,
790
+ "step": 53500
791
+ },
792
+ {
793
+ "epoch": 4.185073238781679,
794
+ "grad_norm": 7.047250270843506,
795
+ "learning_rate": 1.1629853522436642e-05,
796
+ "loss": 2.7213,
797
+ "step": 54000
798
+ },
799
+ {
800
+ "epoch": 4.223823916918546,
801
+ "grad_norm": 6.682352066040039,
802
+ "learning_rate": 1.1552352166162907e-05,
803
+ "loss": 2.7025,
804
+ "step": 54500
805
+ },
806
+ {
807
+ "epoch": 4.262574595055414,
808
+ "grad_norm": 6.547230243682861,
809
+ "learning_rate": 1.1474850809889173e-05,
810
+ "loss": 2.7068,
811
+ "step": 55000
812
+ },
813
+ {
814
+ "epoch": 4.301325273192281,
815
+ "grad_norm": 6.038912296295166,
816
+ "learning_rate": 1.1397349453615438e-05,
817
+ "loss": 2.7061,
818
+ "step": 55500
819
+ },
820
+ {
821
+ "epoch": 4.3400759513291485,
822
+ "grad_norm": 6.072612762451172,
823
+ "learning_rate": 1.1319848097341703e-05,
824
+ "loss": 2.7037,
825
+ "step": 56000
826
+ },
827
+ {
828
+ "epoch": 4.378826629466015,
829
+ "grad_norm": 5.6306281089782715,
830
+ "learning_rate": 1.1242346741067968e-05,
831
+ "loss": 2.6999,
832
+ "step": 56500
833
+ },
834
+ {
835
+ "epoch": 4.417577307602883,
836
+ "grad_norm": 6.18297004699707,
837
+ "learning_rate": 1.1164845384794234e-05,
838
+ "loss": 2.6974,
839
+ "step": 57000
840
+ },
841
+ {
842
+ "epoch": 4.45632798573975,
843
+ "grad_norm": 6.371115207672119,
844
+ "learning_rate": 1.1087344028520499e-05,
845
+ "loss": 2.6918,
846
+ "step": 57500
847
+ },
848
+ {
849
+ "epoch": 4.495078663876618,
850
+ "grad_norm": 6.444944381713867,
851
+ "learning_rate": 1.1009842672246764e-05,
852
+ "loss": 2.6874,
853
+ "step": 58000
854
+ },
855
+ {
856
+ "epoch": 4.533829342013485,
857
+ "grad_norm": 6.176960468292236,
858
+ "learning_rate": 1.093234131597303e-05,
859
+ "loss": 2.68,
860
+ "step": 58500
861
+ },
862
+ {
863
+ "epoch": 4.572580020150353,
864
+ "grad_norm": 6.731847763061523,
865
+ "learning_rate": 1.0854839959699295e-05,
866
+ "loss": 2.6919,
867
+ "step": 59000
868
+ },
869
+ {
870
+ "epoch": 4.61133069828722,
871
+ "grad_norm": 7.826213836669922,
872
+ "learning_rate": 1.077733860342556e-05,
873
+ "loss": 2.6824,
874
+ "step": 59500
875
+ },
876
+ {
877
+ "epoch": 4.650081376424088,
878
+ "grad_norm": 7.052020072937012,
879
+ "learning_rate": 1.0699837247151825e-05,
880
+ "loss": 2.6616,
881
+ "step": 60000
882
+ },
883
+ {
884
+ "epoch": 4.6888320545609545,
885
+ "grad_norm": 5.36915922164917,
886
+ "learning_rate": 1.062233589087809e-05,
887
+ "loss": 2.667,
888
+ "step": 60500
889
+ },
890
+ {
891
+ "epoch": 4.727582732697822,
892
+ "grad_norm": 6.491717338562012,
893
+ "learning_rate": 1.0544834534604356e-05,
894
+ "loss": 2.6896,
895
+ "step": 61000
896
+ },
897
+ {
898
+ "epoch": 4.766333410834689,
899
+ "grad_norm": 7.702902793884277,
900
+ "learning_rate": 1.0467333178330621e-05,
901
+ "loss": 2.6712,
902
+ "step": 61500
903
+ },
904
+ {
905
+ "epoch": 4.805084088971557,
906
+ "grad_norm": 6.359930992126465,
907
+ "learning_rate": 1.0389831822056886e-05,
908
+ "loss": 2.6704,
909
+ "step": 62000
910
+ },
911
+ {
912
+ "epoch": 4.843834767108424,
913
+ "grad_norm": 6.2874531745910645,
914
+ "learning_rate": 1.0312330465783152e-05,
915
+ "loss": 2.6757,
916
+ "step": 62500
917
+ },
918
+ {
919
+ "epoch": 4.882585445245292,
920
+ "grad_norm": 6.827906131744385,
921
+ "learning_rate": 1.0234829109509417e-05,
922
+ "loss": 2.6567,
923
+ "step": 63000
924
+ },
925
+ {
926
+ "epoch": 4.921336123382159,
927
+ "grad_norm": 6.620416164398193,
928
+ "learning_rate": 1.0157327753235682e-05,
929
+ "loss": 2.6615,
930
+ "step": 63500
931
+ },
932
+ {
933
+ "epoch": 4.960086801519027,
934
+ "grad_norm": 6.6219162940979,
935
+ "learning_rate": 1.0079826396961947e-05,
936
+ "loss": 2.657,
937
+ "step": 64000
938
+ },
939
+ {
940
+ "epoch": 4.998837479655894,
941
+ "grad_norm": 6.214903831481934,
942
+ "learning_rate": 1.0002325040688213e-05,
943
+ "loss": 2.6549,
944
+ "step": 64500
945
+ },
946
+ {
947
+ "epoch": 5.0,
948
+ "eval_loss": 2.578911066055298,
949
+ "eval_runtime": 265.1883,
950
+ "eval_samples_per_second": 778.56,
951
+ "eval_steps_per_second": 12.169,
952
+ "step": 64515
953
+ },
954
+ {
955
+ "epoch": 5.0375881577927615,
956
+ "grad_norm": 6.627685546875,
957
+ "learning_rate": 9.924823684414478e-06,
958
+ "loss": 2.6203,
959
+ "step": 65000
960
+ },
961
+ {
962
+ "epoch": 5.076338835929628,
963
+ "grad_norm": 6.23040771484375,
964
+ "learning_rate": 9.847322328140743e-06,
965
+ "loss": 2.6349,
966
+ "step": 65500
967
+ },
968
+ {
969
+ "epoch": 5.115089514066496,
970
+ "grad_norm": 6.667369365692139,
971
+ "learning_rate": 9.769820971867009e-06,
972
+ "loss": 2.647,
973
+ "step": 66000
974
+ },
975
+ {
976
+ "epoch": 5.153840192203363,
977
+ "grad_norm": 6.694558620452881,
978
+ "learning_rate": 9.692319615593274e-06,
979
+ "loss": 2.6214,
980
+ "step": 66500
981
+ },
982
+ {
983
+ "epoch": 5.192590870340231,
984
+ "grad_norm": 6.280242443084717,
985
+ "learning_rate": 9.614818259319539e-06,
986
+ "loss": 2.6206,
987
+ "step": 67000
988
+ },
989
+ {
990
+ "epoch": 5.231341548477098,
991
+ "grad_norm": 6.660119533538818,
992
+ "learning_rate": 9.537316903045804e-06,
993
+ "loss": 2.6307,
994
+ "step": 67500
995
+ },
996
+ {
997
+ "epoch": 5.270092226613966,
998
+ "grad_norm": 6.439652919769287,
999
+ "learning_rate": 9.45981554677207e-06,
1000
+ "loss": 2.6431,
1001
+ "step": 68000
1002
+ },
1003
+ {
1004
+ "epoch": 5.308842904750833,
1005
+ "grad_norm": 6.055843830108643,
1006
+ "learning_rate": 9.382314190498335e-06,
1007
+ "loss": 2.6144,
1008
+ "step": 68500
1009
+ },
1010
+ {
1011
+ "epoch": 5.347593582887701,
1012
+ "grad_norm": 6.519714832305908,
1013
+ "learning_rate": 9.3048128342246e-06,
1014
+ "loss": 2.6056,
1015
+ "step": 69000
1016
+ },
1017
+ {
1018
+ "epoch": 5.3863442610245675,
1019
+ "grad_norm": 6.72304630279541,
1020
+ "learning_rate": 9.227311477950864e-06,
1021
+ "loss": 2.623,
1022
+ "step": 69500
1023
+ },
1024
+ {
1025
+ "epoch": 5.425094939161435,
1026
+ "grad_norm": 7.048790454864502,
1027
+ "learning_rate": 9.149810121677129e-06,
1028
+ "loss": 2.6043,
1029
+ "step": 70000
1030
+ },
1031
+ {
1032
+ "epoch": 5.463845617298302,
1033
+ "grad_norm": 6.654219627380371,
1034
+ "learning_rate": 9.072308765403394e-06,
1035
+ "loss": 2.6135,
1036
+ "step": 70500
1037
+ },
1038
+ {
1039
+ "epoch": 5.50259629543517,
1040
+ "grad_norm": 5.948112487792969,
1041
+ "learning_rate": 8.99480740912966e-06,
1042
+ "loss": 2.6295,
1043
+ "step": 71000
1044
+ },
1045
+ {
1046
+ "epoch": 5.541346973572038,
1047
+ "grad_norm": 7.8044328689575195,
1048
+ "learning_rate": 8.917306052855925e-06,
1049
+ "loss": 2.6104,
1050
+ "step": 71500
1051
+ },
1052
+ {
1053
+ "epoch": 5.580097651708905,
1054
+ "grad_norm": 6.743612766265869,
1055
+ "learning_rate": 8.83980469658219e-06,
1056
+ "loss": 2.6216,
1057
+ "step": 72000
1058
+ },
1059
+ {
1060
+ "epoch": 5.618848329845772,
1061
+ "grad_norm": 6.346240043640137,
1062
+ "learning_rate": 8.762303340308455e-06,
1063
+ "loss": 2.6238,
1064
+ "step": 72500
1065
+ },
1066
+ {
1067
+ "epoch": 5.65759900798264,
1068
+ "grad_norm": 6.496920108795166,
1069
+ "learning_rate": 8.68480198403472e-06,
1070
+ "loss": 2.6334,
1071
+ "step": 73000
1072
+ },
1073
+ {
1074
+ "epoch": 5.6963496861195075,
1075
+ "grad_norm": 6.356810569763184,
1076
+ "learning_rate": 8.607300627760986e-06,
1077
+ "loss": 2.5995,
1078
+ "step": 73500
1079
+ },
1080
+ {
1081
+ "epoch": 5.7351003642563745,
1082
+ "grad_norm": 6.226792812347412,
1083
+ "learning_rate": 8.529799271487251e-06,
1084
+ "loss": 2.5974,
1085
+ "step": 74000
1086
+ },
1087
+ {
1088
+ "epoch": 5.773851042393241,
1089
+ "grad_norm": 6.6555962562561035,
1090
+ "learning_rate": 8.452297915213516e-06,
1091
+ "loss": 2.6285,
1092
+ "step": 74500
1093
+ },
1094
+ {
1095
+ "epoch": 5.812601720530109,
1096
+ "grad_norm": 6.32110595703125,
1097
+ "learning_rate": 8.374796558939782e-06,
1098
+ "loss": 2.6035,
1099
+ "step": 75000
1100
+ },
1101
+ {
1102
+ "epoch": 5.851352398666977,
1103
+ "grad_norm": 6.651345252990723,
1104
+ "learning_rate": 8.297295202666047e-06,
1105
+ "loss": 2.5886,
1106
+ "step": 75500
1107
+ },
1108
+ {
1109
+ "epoch": 5.890103076803844,
1110
+ "grad_norm": 6.736583232879639,
1111
+ "learning_rate": 8.219793846392312e-06,
1112
+ "loss": 2.5903,
1113
+ "step": 76000
1114
+ },
1115
+ {
1116
+ "epoch": 5.928853754940711,
1117
+ "grad_norm": 6.635737895965576,
1118
+ "learning_rate": 8.142292490118577e-06,
1119
+ "loss": 2.597,
1120
+ "step": 76500
1121
+ },
1122
+ {
1123
+ "epoch": 5.967604433077579,
1124
+ "grad_norm": 6.3186492919921875,
1125
+ "learning_rate": 8.064791133844843e-06,
1126
+ "loss": 2.5732,
1127
+ "step": 77000
1128
+ },
1129
+ {
1130
+ "epoch": 6.0,
1131
+ "eval_loss": 2.5146169662475586,
1132
+ "eval_runtime": 259.2569,
1133
+ "eval_samples_per_second": 796.372,
1134
+ "eval_steps_per_second": 12.447,
1135
+ "step": 77418
1136
+ },
1137
+ {
1138
+ "epoch": 6.006355111214447,
1139
+ "grad_norm": 6.408041000366211,
1140
+ "learning_rate": 7.987289777571108e-06,
1141
+ "loss": 2.5742,
1142
+ "step": 77500
1143
+ },
1144
+ {
1145
+ "epoch": 6.045105789351314,
1146
+ "grad_norm": 6.398166656494141,
1147
+ "learning_rate": 7.909788421297373e-06,
1148
+ "loss": 2.5829,
1149
+ "step": 78000
1150
+ },
1151
+ {
1152
+ "epoch": 6.083856467488181,
1153
+ "grad_norm": 6.89434289932251,
1154
+ "learning_rate": 7.832287065023639e-06,
1155
+ "loss": 2.58,
1156
+ "step": 78500
1157
+ },
1158
+ {
1159
+ "epoch": 6.122607145625048,
1160
+ "grad_norm": 5.935701847076416,
1161
+ "learning_rate": 7.754785708749904e-06,
1162
+ "loss": 2.5853,
1163
+ "step": 79000
1164
+ },
1165
+ {
1166
+ "epoch": 6.161357823761916,
1167
+ "grad_norm": 7.224461555480957,
1168
+ "learning_rate": 7.677284352476169e-06,
1169
+ "loss": 2.5597,
1170
+ "step": 79500
1171
+ },
1172
+ {
1173
+ "epoch": 6.200108501898783,
1174
+ "grad_norm": 6.59751033782959,
1175
+ "learning_rate": 7.5997829962024335e-06,
1176
+ "loss": 2.5821,
1177
+ "step": 80000
1178
+ },
1179
+ {
1180
+ "epoch": 6.238859180035651,
1181
+ "grad_norm": 6.414103031158447,
1182
+ "learning_rate": 7.522281639928699e-06,
1183
+ "loss": 2.5542,
1184
+ "step": 80500
1185
+ },
1186
+ {
1187
+ "epoch": 6.277609858172518,
1188
+ "grad_norm": 6.270075798034668,
1189
+ "learning_rate": 7.444780283654964e-06,
1190
+ "loss": 2.5735,
1191
+ "step": 81000
1192
+ },
1193
+ {
1194
+ "epoch": 6.316360536309386,
1195
+ "grad_norm": 6.3846306800842285,
1196
+ "learning_rate": 7.367278927381229e-06,
1197
+ "loss": 2.5563,
1198
+ "step": 81500
1199
+ },
1200
+ {
1201
+ "epoch": 6.355111214446253,
1202
+ "grad_norm": 6.725887298583984,
1203
+ "learning_rate": 7.2897775711074945e-06,
1204
+ "loss": 2.5582,
1205
+ "step": 82000
1206
+ },
1207
+ {
1208
+ "epoch": 6.3938618925831205,
1209
+ "grad_norm": 6.913090229034424,
1210
+ "learning_rate": 7.21227621483376e-06,
1211
+ "loss": 2.5681,
1212
+ "step": 82500
1213
+ },
1214
+ {
1215
+ "epoch": 6.4326125707199875,
1216
+ "grad_norm": 6.630814075469971,
1217
+ "learning_rate": 7.134774858560025e-06,
1218
+ "loss": 2.5493,
1219
+ "step": 83000
1220
+ },
1221
+ {
1222
+ "epoch": 6.471363248856855,
1223
+ "grad_norm": 7.482264518737793,
1224
+ "learning_rate": 7.05727350228629e-06,
1225
+ "loss": 2.5672,
1226
+ "step": 83500
1227
+ },
1228
+ {
1229
+ "epoch": 6.510113926993722,
1230
+ "grad_norm": 5.896800518035889,
1231
+ "learning_rate": 6.979772146012556e-06,
1232
+ "loss": 2.5563,
1233
+ "step": 84000
1234
+ },
1235
+ {
1236
+ "epoch": 6.54886460513059,
1237
+ "grad_norm": 6.603734016418457,
1238
+ "learning_rate": 6.902270789738821e-06,
1239
+ "loss": 2.5358,
1240
+ "step": 84500
1241
+ },
1242
+ {
1243
+ "epoch": 6.587615283267457,
1244
+ "grad_norm": 6.386889457702637,
1245
+ "learning_rate": 6.824769433465086e-06,
1246
+ "loss": 2.5449,
1247
+ "step": 85000
1248
+ },
1249
+ {
1250
+ "epoch": 6.626365961404325,
1251
+ "grad_norm": 6.661931037902832,
1252
+ "learning_rate": 6.747268077191351e-06,
1253
+ "loss": 2.5405,
1254
+ "step": 85500
1255
+ },
1256
+ {
1257
+ "epoch": 6.665116639541192,
1258
+ "grad_norm": 6.331045627593994,
1259
+ "learning_rate": 6.669766720917617e-06,
1260
+ "loss": 2.5419,
1261
+ "step": 86000
1262
+ },
1263
+ {
1264
+ "epoch": 6.70386731767806,
1265
+ "grad_norm": 7.050119400024414,
1266
+ "learning_rate": 6.592265364643882e-06,
1267
+ "loss": 2.5196,
1268
+ "step": 86500
1269
+ },
1270
+ {
1271
+ "epoch": 6.742617995814927,
1272
+ "grad_norm": 6.065616130828857,
1273
+ "learning_rate": 6.514764008370147e-06,
1274
+ "loss": 2.539,
1275
+ "step": 87000
1276
+ },
1277
+ {
1278
+ "epoch": 6.781368673951794,
1279
+ "grad_norm": 5.768097877502441,
1280
+ "learning_rate": 6.4372626520964125e-06,
1281
+ "loss": 2.5245,
1282
+ "step": 87500
1283
+ },
1284
+ {
1285
+ "epoch": 6.820119352088661,
1286
+ "grad_norm": 6.785781383514404,
1287
+ "learning_rate": 6.359761295822677e-06,
1288
+ "loss": 2.5473,
1289
+ "step": 88000
1290
+ },
1291
+ {
1292
+ "epoch": 6.858870030225529,
1293
+ "grad_norm": 6.658846855163574,
1294
+ "learning_rate": 6.282259939548942e-06,
1295
+ "loss": 2.5385,
1296
+ "step": 88500
1297
+ },
1298
+ {
1299
+ "epoch": 6.897620708362396,
1300
+ "grad_norm": 5.932773590087891,
1301
+ "learning_rate": 6.2047585832752074e-06,
1302
+ "loss": 2.528,
1303
+ "step": 89000
1304
+ },
1305
+ {
1306
+ "epoch": 6.936371386499264,
1307
+ "grad_norm": 6.457767963409424,
1308
+ "learning_rate": 6.127257227001473e-06,
1309
+ "loss": 2.5327,
1310
+ "step": 89500
1311
+ },
1312
+ {
1313
+ "epoch": 6.975122064636131,
1314
+ "grad_norm": 6.143023490905762,
1315
+ "learning_rate": 6.049755870727738e-06,
1316
+ "loss": 2.5352,
1317
+ "step": 90000
1318
+ },
1319
+ {
1320
+ "epoch": 7.0,
1321
+ "eval_loss": 2.4585013389587402,
1322
+ "eval_runtime": 258.9573,
1323
+ "eval_samples_per_second": 797.294,
1324
+ "eval_steps_per_second": 12.462,
1325
+ "step": 90321
1326
+ },
1327
+ {
1328
+ "epoch": 7.013872742772999,
1329
+ "grad_norm": 6.153046607971191,
1330
+ "learning_rate": 5.972254514454003e-06,
1331
+ "loss": 2.5315,
1332
+ "step": 90500
1333
+ },
1334
+ {
1335
+ "epoch": 7.052623420909866,
1336
+ "grad_norm": 7.131119728088379,
1337
+ "learning_rate": 5.8947531581802685e-06,
1338
+ "loss": 2.5431,
1339
+ "step": 91000
1340
+ },
1341
+ {
1342
+ "epoch": 7.0913740990467335,
1343
+ "grad_norm": 6.677100658416748,
1344
+ "learning_rate": 5.817251801906534e-06,
1345
+ "loss": 2.5204,
1346
+ "step": 91500
1347
+ },
1348
+ {
1349
+ "epoch": 7.1301247771836005,
1350
+ "grad_norm": 6.799976348876953,
1351
+ "learning_rate": 5.739750445632799e-06,
1352
+ "loss": 2.5221,
1353
+ "step": 92000
1354
+ },
1355
+ {
1356
+ "epoch": 7.168875455320468,
1357
+ "grad_norm": 6.515171051025391,
1358
+ "learning_rate": 5.662249089359064e-06,
1359
+ "loss": 2.5222,
1360
+ "step": 92500
1361
+ },
1362
+ {
1363
+ "epoch": 7.207626133457335,
1364
+ "grad_norm": 7.057505130767822,
1365
+ "learning_rate": 5.58474773308533e-06,
1366
+ "loss": 2.5262,
1367
+ "step": 93000
1368
+ },
1369
+ {
1370
+ "epoch": 7.246376811594203,
1371
+ "grad_norm": 5.927343368530273,
1372
+ "learning_rate": 5.507246376811595e-06,
1373
+ "loss": 2.5272,
1374
+ "step": 93500
1375
+ },
1376
+ {
1377
+ "epoch": 7.28512748973107,
1378
+ "grad_norm": 6.7214155197143555,
1379
+ "learning_rate": 5.42974502053786e-06,
1380
+ "loss": 2.5195,
1381
+ "step": 94000
1382
+ },
1383
+ {
1384
+ "epoch": 7.323878167867938,
1385
+ "grad_norm": 6.162799835205078,
1386
+ "learning_rate": 5.352243664264125e-06,
1387
+ "loss": 2.5117,
1388
+ "step": 94500
1389
+ },
1390
+ {
1391
+ "epoch": 7.362628846004805,
1392
+ "grad_norm": 6.725783824920654,
1393
+ "learning_rate": 5.274742307990391e-06,
1394
+ "loss": 2.522,
1395
+ "step": 95000
1396
+ },
1397
+ {
1398
+ "epoch": 7.401379524141673,
1399
+ "grad_norm": 5.721879959106445,
1400
+ "learning_rate": 5.197240951716656e-06,
1401
+ "loss": 2.5047,
1402
+ "step": 95500
1403
+ },
1404
+ {
1405
+ "epoch": 7.44013020227854,
1406
+ "grad_norm": 7.531757354736328,
1407
+ "learning_rate": 5.11973959544292e-06,
1408
+ "loss": 2.4981,
1409
+ "step": 96000
1410
+ },
1411
+ {
1412
+ "epoch": 7.478880880415407,
1413
+ "grad_norm": 6.200819492340088,
1414
+ "learning_rate": 5.042238239169186e-06,
1415
+ "loss": 2.5016,
1416
+ "step": 96500
1417
+ },
1418
+ {
1419
+ "epoch": 7.517631558552274,
1420
+ "grad_norm": 6.8695597648620605,
1421
+ "learning_rate": 4.964736882895451e-06,
1422
+ "loss": 2.5085,
1423
+ "step": 97000
1424
+ },
1425
+ {
1426
+ "epoch": 7.556382236689142,
1427
+ "grad_norm": 6.3883843421936035,
1428
+ "learning_rate": 4.887235526621716e-06,
1429
+ "loss": 2.5092,
1430
+ "step": 97500
1431
+ },
1432
+ {
1433
+ "epoch": 7.595132914826009,
1434
+ "grad_norm": 6.085172653198242,
1435
+ "learning_rate": 4.809734170347981e-06,
1436
+ "loss": 2.4957,
1437
+ "step": 98000
1438
+ },
1439
+ {
1440
+ "epoch": 7.633883592962877,
1441
+ "grad_norm": 6.23600435256958,
1442
+ "learning_rate": 4.732232814074247e-06,
1443
+ "loss": 2.4876,
1444
+ "step": 98500
1445
+ },
1446
+ {
1447
+ "epoch": 7.672634271099744,
1448
+ "grad_norm": 6.483453750610352,
1449
+ "learning_rate": 4.654731457800512e-06,
1450
+ "loss": 2.5029,
1451
+ "step": 99000
1452
+ },
1453
+ {
1454
+ "epoch": 7.711384949236612,
1455
+ "grad_norm": 6.627302646636963,
1456
+ "learning_rate": 4.577230101526777e-06,
1457
+ "loss": 2.4989,
1458
+ "step": 99500
1459
+ },
1460
+ {
1461
+ "epoch": 7.750135627373479,
1462
+ "grad_norm": 7.044070243835449,
1463
+ "learning_rate": 4.4997287452530425e-06,
1464
+ "loss": 2.5085,
1465
+ "step": 100000
1466
+ },
1467
+ {
1468
+ "epoch": 7.7888863055103466,
1469
+ "grad_norm": 5.986552715301514,
1470
+ "learning_rate": 4.422227388979308e-06,
1471
+ "loss": 2.4842,
1472
+ "step": 100500
1473
+ },
1474
+ {
1475
+ "epoch": 7.8276369836472135,
1476
+ "grad_norm": 6.3408708572387695,
1477
+ "learning_rate": 4.344726032705573e-06,
1478
+ "loss": 2.4973,
1479
+ "step": 101000
1480
+ },
1481
+ {
1482
+ "epoch": 7.866387661784081,
1483
+ "grad_norm": 6.100359916687012,
1484
+ "learning_rate": 4.267224676431838e-06,
1485
+ "loss": 2.5111,
1486
+ "step": 101500
1487
+ },
1488
+ {
1489
+ "epoch": 7.905138339920948,
1490
+ "grad_norm": 6.7454833984375,
1491
+ "learning_rate": 4.1897233201581036e-06,
1492
+ "loss": 2.4766,
1493
+ "step": 102000
1494
+ },
1495
+ {
1496
+ "epoch": 7.943889018057816,
1497
+ "grad_norm": 6.790141582489014,
1498
+ "learning_rate": 4.112221963884369e-06,
1499
+ "loss": 2.4788,
1500
+ "step": 102500
1501
+ },
1502
+ {
1503
+ "epoch": 7.982639696194683,
1504
+ "grad_norm": 6.926203727722168,
1505
+ "learning_rate": 4.034720607610634e-06,
1506
+ "loss": 2.4875,
1507
+ "step": 103000
1508
+ },
1509
+ {
1510
+ "epoch": 8.0,
1511
+ "eval_loss": 2.435317277908325,
1512
+ "eval_runtime": 258.5225,
1513
+ "eval_samples_per_second": 798.634,
1514
+ "eval_steps_per_second": 12.482,
1515
+ "step": 103224
1516
+ },
1517
+ {
1518
+ "epoch": 8.02139037433155,
1519
+ "grad_norm": 6.832672119140625,
1520
+ "learning_rate": 3.957219251336899e-06,
1521
+ "loss": 2.4812,
1522
+ "step": 103500
1523
+ },
1524
+ {
1525
+ "epoch": 8.060141052468419,
1526
+ "grad_norm": 6.771292209625244,
1527
+ "learning_rate": 3.879717895063164e-06,
1528
+ "loss": 2.4945,
1529
+ "step": 104000
1530
+ },
1531
+ {
1532
+ "epoch": 8.098891730605285,
1533
+ "grad_norm": 6.624267101287842,
1534
+ "learning_rate": 3.802216538789429e-06,
1535
+ "loss": 2.4813,
1536
+ "step": 104500
1537
+ },
1538
+ {
1539
+ "epoch": 8.137642408742153,
1540
+ "grad_norm": 6.566524028778076,
1541
+ "learning_rate": 3.724715182515694e-06,
1542
+ "loss": 2.5087,
1543
+ "step": 105000
1544
+ },
1545
+ {
1546
+ "epoch": 8.17639308687902,
1547
+ "grad_norm": 6.612277507781982,
1548
+ "learning_rate": 3.647213826241959e-06,
1549
+ "loss": 2.481,
1550
+ "step": 105500
1551
+ },
1552
+ {
1553
+ "epoch": 8.215143765015888,
1554
+ "grad_norm": 6.12284517288208,
1555
+ "learning_rate": 3.5697124699682244e-06,
1556
+ "loss": 2.4825,
1557
+ "step": 106000
1558
+ },
1559
+ {
1560
+ "epoch": 8.253894443152754,
1561
+ "grad_norm": 6.495052814483643,
1562
+ "learning_rate": 3.4922111136944897e-06,
1563
+ "loss": 2.4883,
1564
+ "step": 106500
1565
+ },
1566
+ {
1567
+ "epoch": 8.292645121289622,
1568
+ "grad_norm": 7.689423561096191,
1569
+ "learning_rate": 3.414709757420755e-06,
1570
+ "loss": 2.4857,
1571
+ "step": 107000
1572
+ },
1573
+ {
1574
+ "epoch": 8.33139579942649,
1575
+ "grad_norm": 6.188397407531738,
1576
+ "learning_rate": 3.3372084011470202e-06,
1577
+ "loss": 2.4788,
1578
+ "step": 107500
1579
+ },
1580
+ {
1581
+ "epoch": 8.370146477563358,
1582
+ "grad_norm": 6.282194137573242,
1583
+ "learning_rate": 3.2597070448732855e-06,
1584
+ "loss": 2.4856,
1585
+ "step": 108000
1586
+ },
1587
+ {
1588
+ "epoch": 8.408897155700224,
1589
+ "grad_norm": 6.457098007202148,
1590
+ "learning_rate": 3.1822056885995508e-06,
1591
+ "loss": 2.4623,
1592
+ "step": 108500
1593
+ },
1594
+ {
1595
+ "epoch": 8.447647833837092,
1596
+ "grad_norm": 7.726540565490723,
1597
+ "learning_rate": 3.1047043323258156e-06,
1598
+ "loss": 2.4671,
1599
+ "step": 109000
1600
+ },
1601
+ {
1602
+ "epoch": 8.48639851197396,
1603
+ "grad_norm": 6.308920383453369,
1604
+ "learning_rate": 3.027202976052081e-06,
1605
+ "loss": 2.4808,
1606
+ "step": 109500
1607
+ },
1608
+ {
1609
+ "epoch": 8.525149190110827,
1610
+ "grad_norm": 6.501667499542236,
1611
+ "learning_rate": 2.949701619778346e-06,
1612
+ "loss": 2.4736,
1613
+ "step": 110000
1614
+ },
1615
+ {
1616
+ "epoch": 8.563899868247695,
1617
+ "grad_norm": 7.358393669128418,
1618
+ "learning_rate": 2.8722002635046114e-06,
1619
+ "loss": 2.4697,
1620
+ "step": 110500
1621
+ },
1622
+ {
1623
+ "epoch": 8.602650546384561,
1624
+ "grad_norm": 6.261012554168701,
1625
+ "learning_rate": 2.7946989072308767e-06,
1626
+ "loss": 2.4631,
1627
+ "step": 111000
1628
+ },
1629
+ {
1630
+ "epoch": 8.64140122452143,
1631
+ "grad_norm": 6.515717029571533,
1632
+ "learning_rate": 2.717197550957142e-06,
1633
+ "loss": 2.4915,
1634
+ "step": 111500
1635
+ },
1636
+ {
1637
+ "epoch": 8.680151902658297,
1638
+ "grad_norm": 6.8307600021362305,
1639
+ "learning_rate": 2.6396961946834072e-06,
1640
+ "loss": 2.48,
1641
+ "step": 112000
1642
+ },
1643
+ {
1644
+ "epoch": 8.718902580795163,
1645
+ "grad_norm": 6.784819602966309,
1646
+ "learning_rate": 2.5621948384096725e-06,
1647
+ "loss": 2.4748,
1648
+ "step": 112500
1649
+ },
1650
+ {
1651
+ "epoch": 8.75765325893203,
1652
+ "grad_norm": 7.1304473876953125,
1653
+ "learning_rate": 2.4846934821359373e-06,
1654
+ "loss": 2.4723,
1655
+ "step": 113000
1656
+ },
1657
+ {
1658
+ "epoch": 8.796403937068899,
1659
+ "grad_norm": 6.297511100769043,
1660
+ "learning_rate": 2.4071921258622026e-06,
1661
+ "loss": 2.463,
1662
+ "step": 113500
1663
+ },
1664
+ {
1665
+ "epoch": 8.835154615205767,
1666
+ "grad_norm": 6.689960479736328,
1667
+ "learning_rate": 2.329690769588468e-06,
1668
+ "loss": 2.4621,
1669
+ "step": 114000
1670
+ },
1671
+ {
1672
+ "epoch": 8.873905293342634,
1673
+ "grad_norm": 6.450560569763184,
1674
+ "learning_rate": 2.252189413314733e-06,
1675
+ "loss": 2.4559,
1676
+ "step": 114500
1677
+ },
1678
+ {
1679
+ "epoch": 8.9126559714795,
1680
+ "grad_norm": 6.459935665130615,
1681
+ "learning_rate": 2.1746880570409984e-06,
1682
+ "loss": 2.4646,
1683
+ "step": 115000
1684
+ },
1685
+ {
1686
+ "epoch": 8.951406649616368,
1687
+ "grad_norm": 6.182426452636719,
1688
+ "learning_rate": 2.0971867007672637e-06,
1689
+ "loss": 2.4665,
1690
+ "step": 115500
1691
+ },
1692
+ {
1693
+ "epoch": 8.990157327753236,
1694
+ "grad_norm": 7.122648239135742,
1695
+ "learning_rate": 2.019685344493529e-06,
1696
+ "loss": 2.475,
1697
+ "step": 116000
1698
+ },
1699
+ {
1700
+ "epoch": 9.0,
1701
+ "eval_loss": 2.406507968902588,
1702
+ "eval_runtime": 258.9009,
1703
+ "eval_samples_per_second": 797.467,
1704
+ "eval_steps_per_second": 12.464,
1705
+ "step": 116127
1706
+ },
1707
+ {
1708
+ "epoch": 9.028908005890104,
1709
+ "grad_norm": 7.267585754394531,
1710
+ "learning_rate": 1.942183988219794e-06,
1711
+ "loss": 2.447,
1712
+ "step": 116500
1713
+ },
1714
+ {
1715
+ "epoch": 9.06765868402697,
1716
+ "grad_norm": 6.2447991371154785,
1717
+ "learning_rate": 1.8646826319460593e-06,
1718
+ "loss": 2.4609,
1719
+ "step": 117000
1720
+ },
1721
+ {
1722
+ "epoch": 9.106409362163838,
1723
+ "grad_norm": 6.521481037139893,
1724
+ "learning_rate": 1.7871812756723245e-06,
1725
+ "loss": 2.4418,
1726
+ "step": 117500
1727
+ },
1728
+ {
1729
+ "epoch": 9.145160040300706,
1730
+ "grad_norm": 6.647397041320801,
1731
+ "learning_rate": 1.7096799193985896e-06,
1732
+ "loss": 2.4665,
1733
+ "step": 118000
1734
+ },
1735
+ {
1736
+ "epoch": 9.183910718437573,
1737
+ "grad_norm": 6.247033596038818,
1738
+ "learning_rate": 1.6321785631248548e-06,
1739
+ "loss": 2.4647,
1740
+ "step": 118500
1741
+ },
1742
+ {
1743
+ "epoch": 9.22266139657444,
1744
+ "grad_norm": 6.595357894897461,
1745
+ "learning_rate": 1.5546772068511201e-06,
1746
+ "loss": 2.4705,
1747
+ "step": 119000
1748
+ },
1749
+ {
1750
+ "epoch": 9.261412074711307,
1751
+ "grad_norm": 8.117677688598633,
1752
+ "learning_rate": 1.4771758505773854e-06,
1753
+ "loss": 2.4629,
1754
+ "step": 119500
1755
+ },
1756
+ {
1757
+ "epoch": 9.300162752848175,
1758
+ "grad_norm": 6.991618633270264,
1759
+ "learning_rate": 1.3996744943036504e-06,
1760
+ "loss": 2.4498,
1761
+ "step": 120000
1762
+ },
1763
+ {
1764
+ "epoch": 9.338913430985043,
1765
+ "grad_norm": 6.236393451690674,
1766
+ "learning_rate": 1.3221731380299157e-06,
1767
+ "loss": 2.467,
1768
+ "step": 120500
1769
+ },
1770
+ {
1771
+ "epoch": 9.377664109121909,
1772
+ "grad_norm": 6.595478534698486,
1773
+ "learning_rate": 1.2446717817561808e-06,
1774
+ "loss": 2.4547,
1775
+ "step": 121000
1776
+ },
1777
+ {
1778
+ "epoch": 9.416414787258777,
1779
+ "grad_norm": 7.194475173950195,
1780
+ "learning_rate": 1.167170425482446e-06,
1781
+ "loss": 2.4669,
1782
+ "step": 121500
1783
+ },
1784
+ {
1785
+ "epoch": 9.455165465395645,
1786
+ "grad_norm": 6.341099262237549,
1787
+ "learning_rate": 1.0896690692087113e-06,
1788
+ "loss": 2.4661,
1789
+ "step": 122000
1790
+ },
1791
+ {
1792
+ "epoch": 9.493916143532513,
1793
+ "grad_norm": 7.257521629333496,
1794
+ "learning_rate": 1.0121677129349766e-06,
1795
+ "loss": 2.4629,
1796
+ "step": 122500
1797
+ },
1798
+ {
1799
+ "epoch": 9.532666821669379,
1800
+ "grad_norm": 6.399875164031982,
1801
+ "learning_rate": 9.346663566612417e-07,
1802
+ "loss": 2.4555,
1803
+ "step": 123000
1804
+ },
1805
+ {
1806
+ "epoch": 9.571417499806246,
1807
+ "grad_norm": 7.292248249053955,
1808
+ "learning_rate": 8.571650003875069e-07,
1809
+ "loss": 2.4646,
1810
+ "step": 123500
1811
+ },
1812
+ {
1813
+ "epoch": 9.610168177943114,
1814
+ "grad_norm": 6.8132548332214355,
1815
+ "learning_rate": 7.79663644113772e-07,
1816
+ "loss": 2.4521,
1817
+ "step": 124000
1818
+ },
1819
+ {
1820
+ "epoch": 9.648918856079982,
1821
+ "grad_norm": 6.302210330963135,
1822
+ "learning_rate": 7.021622878400372e-07,
1823
+ "loss": 2.451,
1824
+ "step": 124500
1825
+ },
1826
+ {
1827
+ "epoch": 9.687669534216848,
1828
+ "grad_norm": 6.902337551116943,
1829
+ "learning_rate": 6.246609315663025e-07,
1830
+ "loss": 2.4515,
1831
+ "step": 125000
1832
+ },
1833
+ {
1834
+ "epoch": 9.726420212353716,
1835
+ "grad_norm": 6.4049296379089355,
1836
+ "learning_rate": 5.471595752925676e-07,
1837
+ "loss": 2.454,
1838
+ "step": 125500
1839
+ },
1840
+ {
1841
+ "epoch": 9.765170890490584,
1842
+ "grad_norm": 7.109240531921387,
1843
+ "learning_rate": 4.6965821901883286e-07,
1844
+ "loss": 2.4379,
1845
+ "step": 126000
1846
+ },
1847
+ {
1848
+ "epoch": 9.803921568627452,
1849
+ "grad_norm": 6.1289873123168945,
1850
+ "learning_rate": 3.921568627450981e-07,
1851
+ "loss": 2.4438,
1852
+ "step": 126500
1853
+ },
1854
+ {
1855
+ "epoch": 9.842672246764318,
1856
+ "grad_norm": 6.873955726623535,
1857
+ "learning_rate": 3.146555064713633e-07,
1858
+ "loss": 2.4526,
1859
+ "step": 127000
1860
+ },
1861
+ {
1862
+ "epoch": 9.881422924901186,
1863
+ "grad_norm": 6.842904090881348,
1864
+ "learning_rate": 2.3715415019762845e-07,
1865
+ "loss": 2.4471,
1866
+ "step": 127500
1867
+ },
1868
+ {
1869
+ "epoch": 9.920173603038053,
1870
+ "grad_norm": 9.636740684509277,
1871
+ "learning_rate": 1.5965279392389367e-07,
1872
+ "loss": 2.4469,
1873
+ "step": 128000
1874
+ },
1875
+ {
1876
+ "epoch": 9.958924281174921,
1877
+ "grad_norm": 6.161515235900879,
1878
+ "learning_rate": 8.21514376501589e-08,
1879
+ "loss": 2.4608,
1880
+ "step": 128500
1881
+ },
1882
+ {
1883
+ "epoch": 9.997674959311787,
1884
+ "grad_norm": 6.582516193389893,
1885
+ "learning_rate": 4.6500813764240875e-09,
1886
+ "loss": 2.4411,
1887
+ "step": 129000
1888
+ },
1889
+ {
1890
+ "epoch": 10.0,
1891
+ "eval_loss": 2.3977291584014893,
1892
+ "eval_runtime": 258.9982,
1893
+ "eval_samples_per_second": 797.168,
1894
+ "eval_steps_per_second": 12.46,
1895
+ "step": 129030
1896
+ }
1897
+ ],
1898
+ "logging_steps": 500,
1899
+ "max_steps": 129030,
1900
+ "num_input_tokens_seen": 0,
1901
+ "num_train_epochs": 10,
1902
+ "save_steps": 500,
1903
+ "stateful_callbacks": {
1904
+ "TrainerControl": {
1905
+ "args": {
1906
+ "should_epoch_stop": false,
1907
+ "should_evaluate": false,
1908
+ "should_log": false,
1909
+ "should_save": true,
1910
+ "should_training_stop": true
1911
+ },
1912
+ "attributes": {}
1913
+ }
1914
+ },
1915
+ "total_flos": 3.235049087048045e+17,
1916
+ "train_batch_size": 64,
1917
+ "trial_name": null,
1918
+ "trial_params": null
1919
+ }
training_args.bin ADDED
@@ -0,0 +1,3 @@
 
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:7c29e483a17b334874faa05ddef49273d268f86505fb1cbbeea2a4cddc04a790
3
+ size 5048
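The `trainer_state.json` uploaded in this commit keeps one dict per logged step in its `log_history` array: training entries carry `loss`, `grad_norm`, and `learning_rate`, while end-of-epoch entries carry `eval_loss` and runtime stats. A minimal sketch of reading that structure (the two records below are copied from the log above; the parsing code itself is an illustration, not part of the upload):

```python
import json

# A small excerpt mirroring the "log_history" schema of the uploaded
# trainer_state.json; values are taken from the final entries in this commit.
state = json.loads("""
{
  "log_history": [
    {"epoch": 9.997674959311787, "loss": 2.4411, "step": 129000},
    {"epoch": 10.0, "eval_loss": 2.3977291584014893, "step": 129030}
  ],
  "max_steps": 129030,
  "num_train_epochs": 10
}
""")

# Evaluation records are distinguished from training records by key presence.
eval_records = [r for r in state["log_history"] if "eval_loss" in r]
best = min(eval_records, key=lambda r: r["eval_loss"])
print(best["step"], round(best["eval_loss"], 4))  # → 129030 2.3977
```

Against the full file, the same filter yields one eval record per epoch (steps 77418, 90321, 103224, 116127, 129030), showing eval loss falling monotonically from 2.5146 to 2.3977 over the final five epochs.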