austindavis committed
Commit d0368b6 (verified)
1 Parent(s): 849fa4b

End of training
Files changed (3)
  1. README.md +328 -196
  2. generation_config.json +2 -5
  3. model.safetensors +1 -1
README.md CHANGED
@@ -1,199 +1,331 @@
  ---
- library_name: transformers
- tags: []
  ---

- # Model Card for Model ID
-
- <!-- Provide a quick summary of what the model is/does. -->
-
-
-
- ## Model Details
-
- ### Model Description
-
- <!-- Provide a longer summary of what this model is. -->
-
- This is the model card of a 🤗 transformers model that has been pushed on the Hub. This model card has been automatically generated.
-
- - **Developed by:** [More Information Needed]
- - **Funded by [optional]:** [More Information Needed]
- - **Shared by [optional]:** [More Information Needed]
- - **Model type:** [More Information Needed]
- - **Language(s) (NLP):** [More Information Needed]
- - **License:** [More Information Needed]
- - **Finetuned from model [optional]:** [More Information Needed]
-
- ### Model Sources [optional]
-
- <!-- Provide the basic links for the model. -->
-
- - **Repository:** [More Information Needed]
- - **Paper [optional]:** [More Information Needed]
- - **Demo [optional]:** [More Information Needed]
-
- ## Uses
-
- <!-- Address questions around how the model is intended to be used, including the foreseeable users of the model and those affected by the model. -->
-
- ### Direct Use
-
- <!-- This section is for the model use without fine-tuning or plugging into a larger ecosystem/app. -->
-
- [More Information Needed]
-
- ### Downstream Use [optional]
-
- <!-- This section is for the model use when fine-tuned for a task, or when plugged into a larger ecosystem/app -->
-
- [More Information Needed]
-
- ### Out-of-Scope Use
-
- <!-- This section addresses misuse, malicious use, and uses that the model will not work well for. -->
-
- [More Information Needed]
-
- ## Bias, Risks, and Limitations
-
- <!-- This section is meant to convey both technical and sociotechnical limitations. -->
-
- [More Information Needed]
-
- ### Recommendations
-
- <!-- This section is meant to convey recommendations with respect to the bias, risk, and technical limitations. -->
-
- Users (both direct and downstream) should be made aware of the risks, biases and limitations of the model. More information needed for further recommendations.
-
- ## How to Get Started with the Model
-
- Use the code below to get started with the model.
-
- [More Information Needed]
-
- ## Training Details
-
- ### Training Data
-
- <!-- This should link to a Dataset Card, perhaps with a short stub of information on what the training data is all about as well as documentation related to data pre-processing or additional filtering. -->
-
- [More Information Needed]
-
- ### Training Procedure
-
- <!-- This relates heavily to the Technical Specifications. Content here should link to that section when it is relevant to the training procedure. -->
-
- #### Preprocessing [optional]
-
- [More Information Needed]
-
-
- #### Training Hyperparameters
-
- - **Training regime:** [More Information Needed] <!--fp32, fp16 mixed precision, bf16 mixed precision, bf16 non-mixed precision, fp16 non-mixed precision, fp8 mixed precision -->
-
- #### Speeds, Sizes, Times [optional]
-
- <!-- This section provides information about throughput, start/end time, checkpoint size if relevant, etc. -->
-
- [More Information Needed]
-
- ## Evaluation
-
- <!-- This section describes the evaluation protocols and provides the results. -->
-
- ### Testing Data, Factors & Metrics
-
- #### Testing Data
-
- <!-- This should link to a Dataset Card if possible. -->
-
- [More Information Needed]
-
- #### Factors
-
- <!-- These are the things the evaluation is disaggregating by, e.g., subpopulations or domains. -->
-
- [More Information Needed]
-
- #### Metrics
-
- <!-- These are the evaluation metrics being used, ideally with a description of why. -->
-
- [More Information Needed]
-
- ### Results
-
- [More Information Needed]
-
- #### Summary
-
-
-
- ## Model Examination [optional]
-
- <!-- Relevant interpretability work for the model goes here -->
-
- [More Information Needed]
-
- ## Environmental Impact
-
- <!-- Total emissions (in grams of CO2eq) and additional considerations, such as electricity usage, go here. Edit the suggested text below accordingly -->
-
- Carbon emissions can be estimated using the [Machine Learning Impact calculator](https://mlco2.github.io/impact#compute) presented in [Lacoste et al. (2019)](https://arxiv.org/abs/1910.09700).
-
- - **Hardware Type:** [More Information Needed]
- - **Hours used:** [More Information Needed]
- - **Cloud Provider:** [More Information Needed]
- - **Compute Region:** [More Information Needed]
- - **Carbon Emitted:** [More Information Needed]
-
- ## Technical Specifications [optional]
-
- ### Model Architecture and Objective
-
- [More Information Needed]
-
- ### Compute Infrastructure
-
- [More Information Needed]
-
- #### Hardware
-
- [More Information Needed]
-
- #### Software
-
- [More Information Needed]
-
- ## Citation [optional]
-
- <!-- If there is a paper or blog post introducing the model, the APA and Bibtex information for that should go in this section. -->
-
- **BibTeX:**
-
- [More Information Needed]
-
- **APA:**
-
- [More Information Needed]
-
- ## Glossary [optional]
-
- <!-- If relevant, include terms and calculations in this section that can help readers understand the model or model card. -->
-
- [More Information Needed]
-
- ## More Information [optional]
-
- [More Information Needed]
-
- ## Model Card Authors [optional]
-
- [More Information Needed]
-
- ## Model Card Contact
-
- [More Information Needed]
  ---
+ tags:
+ - generated_from_trainer
+ model-index:
+ - name: gpt2-lichess-uci-202306-12x12
+   results: []
  ---

+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # gpt2-lichess-uci-202306-12x12
+
+ This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.8365
+
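Assuming the loss above is mean token-level cross-entropy (the usual Trainer evaluation metric), it converts directly to perplexity via `exp(loss)` — a derived check, not a number reported in this card:

```python
import math

# Perplexity corresponding to the reported evaluation loss of 0.8365,
# assuming it is mean token-level cross-entropy in natural-log units.
eval_loss = 0.8365
perplexity = math.exp(eval_loss)
print(round(perplexity, 2))  # 2.31
```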
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 0.0002
+ - train_batch_size: 6
+ - eval_batch_size: 8
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: cosine
+ - num_epochs: 1
+
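The cosine schedule listed above decays the learning rate from 2e-4 toward zero over the single epoch. A minimal sketch of that shape (no warmup assumed; the actual run may have used transformers' scheduler defaults):

```python
import math

def cosine_lr(step: int, total_steps: int, base_lr: float = 2e-4) -> float:
    """Cosine decay from base_lr at step 0 down to 0 at total_steps."""
    progress = step / total_steps
    return base_lr * 0.5 * (1.0 + math.cos(math.pi * progress))

# Roughly 4,155,000 optimizer steps in this run (last row of the results table).
print(cosine_lr(0, 4_155_000))          # 0.0002 at the start of training
print(cosine_lr(2_077_500, 4_155_000))  # about half the base rate at mid-epoch
print(cosine_lr(4_155_000, 4_155_000))  # 0.0 at the end
```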
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss |
+ |:-------------:|:------:|:-------:|:---------------:|
+ | 1.3438 | 0.0036 | 15000 | 1.3198 |
+ | 1.2309 | 0.0072 | 30000 | 1.2102 |
+ | 1.1738 | 0.0108 | 45000 | 1.1595 |
+ | 1.1384 | 0.0144 | 60000 | 1.1267 |
+ | 1.115 | 0.0180 | 75000 | 1.0988 |
+ | 1.0937 | 0.0216 | 90000 | 1.0820 |
+ | 1.0786 | 0.0253 | 105000 | 1.0670 |
+ | 1.0649 | 0.0289 | 120000 | 1.0541 |
+ | 1.0546 | 0.0325 | 135000 | 1.0429 |
+ | 1.0446 | 0.0361 | 150000 | 1.0323 |
+ | 1.0388 | 0.0397 | 165000 | 1.0247 |
+ | 1.029 | 0.0433 | 180000 | 1.0184 |
+ | 1.022 | 0.0469 | 195000 | 1.0127 |
+ | 1.0144 | 0.0505 | 210000 | 1.0072 |
+ | 1.01 | 0.0541 | 225000 | 1.0008 |
+ | 1.0069 | 0.0577 | 240000 | 0.9964 |
+ | 1.0033 | 0.0613 | 255000 | 0.9928 |
+ | 0.9988 | 0.0649 | 270000 | 0.9867 |
+ | 0.9948 | 0.0685 | 285000 | 0.9836 |
+ | 0.9892 | 0.0722 | 300000 | 0.9803 |
+ | 0.9864 | 0.0758 | 315000 | 0.9764 |
+ | 0.9798 | 0.0794 | 330000 | 0.9733 |
+ | 0.9804 | 0.0830 | 345000 | 0.9713 |
+ | 0.9776 | 0.0866 | 360000 | 0.9684 |
+ | 0.9754 | 0.0902 | 375000 | 0.9656 |
+ | 0.9736 | 0.0938 | 390000 | 0.9639 |
+ | 0.9687 | 0.0974 | 405000 | 0.9609 |
+ | 0.9715 | 0.1010 | 420000 | 0.9592 |
+ | 0.9644 | 0.1046 | 435000 | 0.9581 |
+ | 0.9629 | 0.1082 | 450000 | 0.9550 |
+ | 0.9609 | 0.1118 | 465000 | 0.9544 |
+ | 0.9589 | 0.1154 | 480000 | 0.9511 |
+ | 0.9585 | 0.1190 | 495000 | 0.9493 |
+ | 0.9573 | 0.1227 | 510000 | 0.9491 |
+ | 0.9533 | 0.1263 | 525000 | 0.9473 |
+ | 0.9558 | 0.1299 | 540000 | 0.9453 |
+ | 0.9498 | 0.1335 | 555000 | 0.9434 |
+ | 0.9487 | 0.1371 | 570000 | 0.9415 |
+ | 0.9492 | 0.1407 | 585000 | 0.9411 |
+ | 0.9475 | 0.1443 | 600000 | 0.9390 |
+ | 0.9443 | 0.1479 | 615000 | 0.9387 |
+ | 0.9458 | 0.1515 | 630000 | 0.9372 |
+ | 0.9441 | 0.1551 | 645000 | 0.9358 |
+ | 0.9438 | 0.1587 | 660000 | 0.9350 |
+ | 0.9423 | 0.1623 | 675000 | 0.9341 |
+ | 0.9383 | 0.1659 | 690000 | 0.9321 |
+ | 0.9395 | 0.1696 | 705000 | 0.9314 |
+ | 0.9379 | 0.1732 | 720000 | 0.9301 |
+ | 0.9382 | 0.1768 | 735000 | 0.9289 |
+ | 0.9374 | 0.1804 | 750000 | 0.9284 |
+ | 0.9353 | 0.1840 | 765000 | 0.9278 |
+ | 0.9325 | 0.1876 | 780000 | 0.9270 |
+ | 0.932 | 0.1912 | 795000 | 0.9255 |
+ | 0.9289 | 0.1948 | 810000 | 0.9250 |
+ | 0.9324 | 0.1984 | 825000 | 0.9233 |
+ | 0.9303 | 0.2020 | 840000 | 0.9230 |
+ | 0.9297 | 0.2056 | 855000 | 0.9224 |
+ | 0.9279 | 0.2092 | 870000 | 0.9223 |
+ | 0.9298 | 0.2128 | 885000 | 0.9202 |
+ | 0.9252 | 0.2165 | 900000 | 0.9195 |
+ | 0.9259 | 0.2201 | 915000 | 0.9194 |
+ | 0.9255 | 0.2237 | 930000 | 0.9186 |
+ | 0.9231 | 0.2273 | 945000 | 0.9181 |
+ | 0.9258 | 0.2309 | 960000 | 0.9165 |
+ | 0.9214 | 0.2345 | 975000 | 0.9162 |
+ | 0.924 | 0.2381 | 990000 | 0.9161 |
+ | 0.9245 | 0.2417 | 1005000 | 0.9146 |
+ | 0.919 | 0.2453 | 1020000 | 0.9147 |
+ | 0.9202 | 0.2489 | 1035000 | 0.9133 |
+ | 0.92 | 0.2525 | 1050000 | 0.9125 |
+ | 0.9161 | 0.2561 | 1065000 | 0.9128 |
+ | 0.9174 | 0.2597 | 1080000 | 0.9123 |
+ | 0.9176 | 0.2634 | 1095000 | 0.9107 |
+ | 0.9177 | 0.2670 | 1110000 | 0.9097 |
+ | 0.9178 | 0.2706 | 1125000 | 0.9098 |
+ | 0.9144 | 0.2742 | 1140000 | 0.9085 |
+ | 0.916 | 0.2778 | 1155000 | 0.9091 |
+ | 0.913 | 0.2814 | 1170000 | 0.9088 |
+ | 0.9135 | 0.2850 | 1185000 | 0.9075 |
+ | 0.9145 | 0.2886 | 1200000 | 0.9059 |
+ | 0.9113 | 0.2922 | 1215000 | 0.9064 |
+ | 0.9121 | 0.2958 | 1230000 | 0.9060 |
+ | 0.9108 | 0.2994 | 1245000 | 0.9052 |
+ | 0.909 | 0.3030 | 1260000 | 0.9038 |
+ | 0.9127 | 0.3066 | 1275000 | 0.9040 |
+ | 0.9104 | 0.3103 | 1290000 | 0.9029 |
+ | 0.9097 | 0.3139 | 1305000 | 0.9036 |
+ | 0.9086 | 0.3175 | 1320000 | 0.9023 |
+ | 0.9078 | 0.3211 | 1335000 | 0.9016 |
+ | 0.9077 | 0.3247 | 1350000 | 0.9019 |
+ | 0.9058 | 0.3283 | 1365000 | 0.9010 |
+ | 0.9053 | 0.3319 | 1380000 | 0.9005 |
+ | 0.9069 | 0.3355 | 1395000 | 0.8995 |
+ | 0.9057 | 0.3391 | 1410000 | 0.8997 |
+ | 0.9037 | 0.3427 | 1425000 | 0.8988 |
+ | 0.9038 | 0.3463 | 1440000 | 0.8987 |
+ | 0.9029 | 0.3499 | 1455000 | 0.8979 |
+ | 0.9012 | 0.3535 | 1470000 | 0.8975 |
+ | 0.9003 | 0.3571 | 1485000 | 0.8977 |
+ | 0.9019 | 0.3608 | 1500000 | 0.8966 |
+ | 0.9018 | 0.3644 | 1515000 | 0.8964 |
+ | 0.9008 | 0.3680 | 1530000 | 0.8957 |
+ | 0.8997 | 0.3716 | 1545000 | 0.8951 |
+ | 0.8996 | 0.3752 | 1560000 | 0.8950 |
+ | 0.9 | 0.3788 | 1575000 | 0.8941 |
+ | 0.8989 | 0.3824 | 1590000 | 0.8934 |
+ | 0.8962 | 0.3860 | 1605000 | 0.8938 |
+ | 0.8962 | 0.3896 | 1620000 | 0.8936 |
+ | 0.8972 | 0.3932 | 1635000 | 0.8926 |
+ | 0.8985 | 0.3968 | 1650000 | 0.8915 |
+ | 0.8959 | 0.4004 | 1665000 | 0.8919 |
+ | 0.8966 | 0.4040 | 1680000 | 0.8911 |
+ | 0.8952 | 0.4077 | 1695000 | 0.8908 |
+ | 0.8949 | 0.4113 | 1710000 | 0.8897 |
+ | 0.8943 | 0.4149 | 1725000 | 0.8895 |
+ | 0.8921 | 0.4185 | 1740000 | 0.8892 |
+ | 0.8924 | 0.4221 | 1755000 | 0.8887 |
+ | 0.8926 | 0.4257 | 1770000 | 0.8880 |
+ | 0.8902 | 0.4293 | 1785000 | 0.8876 |
+ | 0.8931 | 0.4329 | 1800000 | 0.8871 |
+ | 0.8909 | 0.4365 | 1815000 | 0.8871 |
+ | 0.8904 | 0.4401 | 1830000 | 0.8863 |
+ | 0.8892 | 0.4437 | 1845000 | 0.8861 |
+ | 0.8886 | 0.4473 | 1860000 | 0.8854 |
+ | 0.8894 | 0.4509 | 1875000 | 0.8855 |
+ | 0.8908 | 0.4546 | 1890000 | 0.8852 |
+ | 0.8898 | 0.4582 | 1905000 | 0.8848 |
+ | 0.886 | 0.4618 | 1920000 | 0.8842 |
+ | 0.8855 | 0.4654 | 1935000 | 0.8830 |
+ | 0.8866 | 0.4690 | 1950000 | 0.8825 |
+ | 0.8848 | 0.4726 | 1965000 | 0.8823 |
+ | 0.8835 | 0.4762 | 1980000 | 0.8825 |
+ | 0.8869 | 0.4798 | 1995000 | 0.8814 |
+ | 0.885 | 0.4834 | 2010000 | 0.8814 |
+ | 0.8869 | 0.4870 | 2025000 | 0.8812 |
+ | 0.8842 | 0.4906 | 2040000 | 0.8804 |
+ | 0.8865 | 0.4942 | 2055000 | 0.8804 |
+ | 0.8837 | 0.4978 | 2070000 | 0.8799 |
+ | 0.8799 | 0.5015 | 2085000 | 0.8798 |
+ | 0.8824 | 0.5051 | 2100000 | 0.8784 |
+ | 0.8821 | 0.5087 | 2115000 | 0.8786 |
+ | 0.8802 | 0.5123 | 2130000 | 0.8779 |
+ | 0.8811 | 0.5159 | 2145000 | 0.8778 |
+ | 0.8805 | 0.5195 | 2160000 | 0.8774 |
+ | 0.8837 | 0.5231 | 2175000 | 0.8765 |
+ | 0.88 | 0.5267 | 2190000 | 0.8764 |
+ | 0.8794 | 0.5303 | 2205000 | 0.8760 |
+ | 0.8805 | 0.5339 | 2220000 | 0.8754 |
+ | 0.8768 | 0.5375 | 2235000 | 0.8757 |
+ | 0.8763 | 0.5411 | 2250000 | 0.8743 |
+ | 0.8762 | 0.5447 | 2265000 | 0.8746 |
+ | 0.8786 | 0.5484 | 2280000 | 0.8748 |
+ | 0.8772 | 0.5520 | 2295000 | 0.8735 |
+ | 0.8758 | 0.5556 | 2310000 | 0.8728 |
+ | 0.874 | 0.5592 | 2325000 | 0.8724 |
+ | 0.8752 | 0.5628 | 2340000 | 0.8722 |
+ | 0.8748 | 0.5664 | 2355000 | 0.8712 |
+ | 0.8749 | 0.5700 | 2370000 | 0.8714 |
+ | 0.874 | 0.5736 | 2385000 | 0.8707 |
+ | 0.8728 | 0.5772 | 2400000 | 0.8702 |
+ | 0.8742 | 0.5808 | 2415000 | 0.8701 |
+ | 0.8683 | 0.5844 | 2430000 | 0.8696 |
+ | 0.8693 | 0.5880 | 2445000 | 0.8695 |
+ | 0.8717 | 0.5916 | 2460000 | 0.8684 |
+ | 0.8697 | 0.5952 | 2475000 | 0.8682 |
+ | 0.8734 | 0.5989 | 2490000 | 0.8675 |
+ | 0.87 | 0.6025 | 2505000 | 0.8672 |
+ | 0.8714 | 0.6061 | 2520000 | 0.8669 |
+ | 0.8671 | 0.6097 | 2535000 | 0.8661 |
+ | 0.8681 | 0.6133 | 2550000 | 0.8661 |
+ | 0.8689 | 0.6169 | 2565000 | 0.8658 |
+ | 0.8655 | 0.6205 | 2580000 | 0.8652 |
+ | 0.8667 | 0.6241 | 2595000 | 0.8646 |
+ | 0.8676 | 0.6277 | 2610000 | 0.8647 |
+ | 0.8671 | 0.6313 | 2625000 | 0.8638 |
+ | 0.8665 | 0.6349 | 2640000 | 0.8632 |
+ | 0.865 | 0.6385 | 2655000 | 0.8632 |
+ | 0.8648 | 0.6421 | 2670000 | 0.8627 |
+ | 0.8646 | 0.6458 | 2685000 | 0.8616 |
+ | 0.8621 | 0.6494 | 2700000 | 0.8619 |
+ | 0.8631 | 0.6530 | 2715000 | 0.8621 |
+ | 0.8623 | 0.6566 | 2730000 | 0.8609 |
+ | 0.8623 | 0.6602 | 2745000 | 0.8606 |
+ | 0.8625 | 0.6638 | 2760000 | 0.8602 |
+ | 0.8637 | 0.6674 | 2775000 | 0.8595 |
+ | 0.8625 | 0.6710 | 2790000 | 0.8594 |
+ | 0.8627 | 0.6746 | 2805000 | 0.8587 |
+ | 0.8606 | 0.6782 | 2820000 | 0.8584 |
+ | 0.8607 | 0.6818 | 2835000 | 0.8578 |
+ | 0.8609 | 0.6854 | 2850000 | 0.8575 |
+ | 0.8577 | 0.6890 | 2865000 | 0.8570 |
+ | 0.8567 | 0.6927 | 2880000 | 0.8567 |
+ | 0.8587 | 0.6963 | 2895000 | 0.8560 |
+ | 0.8558 | 0.6999 | 2910000 | 0.8561 |
+ | 0.8539 | 0.7035 | 2925000 | 0.8556 |
+ | 0.8576 | 0.7071 | 2940000 | 0.8553 |
+ | 0.8555 | 0.7107 | 2955000 | 0.8546 |
+ | 0.8558 | 0.7143 | 2970000 | 0.8545 |
+ | 0.8582 | 0.7179 | 2985000 | 0.8539 |
+ | 0.8544 | 0.7215 | 3000000 | 0.8540 |
+ | 0.8541 | 0.7251 | 3015000 | 0.8536 |
+ | 0.8559 | 0.7287 | 3030000 | 0.8531 |
+ | 0.8521 | 0.7323 | 3045000 | 0.8526 |
+ | 0.8529 | 0.7359 | 3060000 | 0.8521 |
+ | 0.8538 | 0.7396 | 3075000 | 0.8515 |
+ | 0.8521 | 0.7432 | 3090000 | 0.8512 |
+ | 0.8512 | 0.7468 | 3105000 | 0.8507 |
+ | 0.8503 | 0.7504 | 3120000 | 0.8503 |
+ | 0.8534 | 0.7540 | 3135000 | 0.8503 |
+ | 0.8534 | 0.7576 | 3150000 | 0.8501 |
+ | 0.8488 | 0.7612 | 3165000 | 0.8494 |
+ | 0.85 | 0.7648 | 3180000 | 0.8490 |
+ | 0.8483 | 0.7684 | 3195000 | 0.8488 |
+ | 0.8481 | 0.7720 | 3210000 | 0.8484 |
+ | 0.8516 | 0.7756 | 3225000 | 0.8480 |
+ | 0.8483 | 0.7792 | 3240000 | 0.8476 |
+ | 0.8504 | 0.7828 | 3255000 | 0.8471 |
+ | 0.8485 | 0.7865 | 3270000 | 0.8469 |
+ | 0.8461 | 0.7901 | 3285000 | 0.8465 |
+ | 0.8458 | 0.7937 | 3300000 | 0.8460 |
+ | 0.8467 | 0.7973 | 3315000 | 0.8460 |
+ | 0.847 | 0.8009 | 3330000 | 0.8454 |
+ | 0.8479 | 0.8045 | 3345000 | 0.8451 |
+ | 0.8441 | 0.8081 | 3360000 | 0.8447 |
+ | 0.8465 | 0.8117 | 3375000 | 0.8444 |
+ | 0.8431 | 0.8153 | 3390000 | 0.8444 |
+ | 0.8453 | 0.8189 | 3405000 | 0.8436 |
+ | 0.8453 | 0.8225 | 3420000 | 0.8436 |
+ | 0.8444 | 0.8261 | 3435000 | 0.8435 |
+ | 0.8425 | 0.8297 | 3450000 | 0.8430 |
+ | 0.8452 | 0.8333 | 3465000 | 0.8429 |
+ | 0.8423 | 0.8370 | 3480000 | 0.8427 |
+ | 0.8423 | 0.8406 | 3495000 | 0.8419 |
+ | 0.8424 | 0.8442 | 3510000 | 0.8416 |
+ | 0.8462 | 0.8478 | 3525000 | 0.8417 |
+ | 0.8414 | 0.8514 | 3540000 | 0.8413 |
+ | 0.8435 | 0.8550 | 3555000 | 0.8415 |
+ | 0.8428 | 0.8586 | 3570000 | 0.8410 |
+ | 0.8396 | 0.8622 | 3585000 | 0.8405 |
+ | 0.8416 | 0.8658 | 3600000 | 0.8403 |
+ | 0.8408 | 0.8694 | 3615000 | 0.8400 |
+ | 0.8392 | 0.8730 | 3630000 | 0.8400 |
+ | 0.8404 | 0.8766 | 3645000 | 0.8396 |
+ | 0.838 | 0.8802 | 3660000 | 0.8396 |
+ | 0.8401 | 0.8839 | 3675000 | 0.8392 |
+ | 0.84 | 0.8875 | 3690000 | 0.8391 |
+ | 0.8401 | 0.8911 | 3705000 | 0.8389 |
+ | 0.841 | 0.8947 | 3720000 | 0.8389 |
+ | 0.8389 | 0.8983 | 3735000 | 0.8386 |
+ | 0.8384 | 0.9019 | 3750000 | 0.8385 |
+ | 0.8375 | 0.9055 | 3765000 | 0.8384 |
+ | 0.8391 | 0.9091 | 3780000 | 0.8381 |
+ | 0.8371 | 0.9127 | 3795000 | 0.8381 |
+ | 0.8387 | 0.9163 | 3810000 | 0.8379 |
+ | 0.8375 | 0.9199 | 3825000 | 0.8377 |
+ | 0.8376 | 0.9235 | 3840000 | 0.8375 |
+ | 0.8377 | 0.9271 | 3855000 | 0.8374 |
+ | 0.8383 | 0.9308 | 3870000 | 0.8373 |
+ | 0.8368 | 0.9344 | 3885000 | 0.8373 |
+ | 0.8385 | 0.9380 | 3900000 | 0.8372 |
+ | 0.8369 | 0.9416 | 3915000 | 0.8372 |
+ | 0.8364 | 0.9452 | 3930000 | 0.8370 |
+ | 0.8381 | 0.9488 | 3945000 | 0.8369 |
+ | 0.8369 | 0.9524 | 3960000 | 0.8368 |
+ | 0.8378 | 0.9560 | 3975000 | 0.8367 |
+ | 0.8341 | 0.9596 | 3990000 | 0.8367 |
+ | 0.8365 | 0.9632 | 4005000 | 0.8367 |
+ | 0.8396 | 0.9668 | 4020000 | 0.8367 |
+ | 0.8345 | 0.9704 | 4035000 | 0.8366 |
+ | 0.8375 | 0.9740 | 4050000 | 0.8366 |
+ | 0.8368 | 0.9777 | 4065000 | 0.8366 |
+ | 0.8372 | 0.9813 | 4080000 | 0.8366 |
+ | 0.8393 | 0.9849 | 4095000 | 0.8365 |
+ | 0.836 | 0.9885 | 4110000 | 0.8365 |
+ | 0.8369 | 0.9921 | 4125000 | 0.8365 |
+ | 0.8349 | 0.9957 | 4140000 | 0.8365 |
+ | 0.8383 | 0.9993 | 4155000 | 0.8365 |
+
+
+ ### Framework versions
+
+ - Transformers 4.40.1
+ - Pytorch 2.3.0
+ - Datasets 2.19.1
+ - Tokenizers 0.19.1
generation_config.json CHANGED
@@ -1,9 +1,6 @@
  {
- "do_sample": true,
+ "_from_model_config": true,
+ "bos_token_id": 1,
  "eos_token_id": 2,
- "max_length": 512,
- "max_new_tokens": 512,
- "pad_token_id": 0,
- "temperature": 0.0001,
  "transformers_version": "4.40.1"
  }
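The net effect of this change is that the hard-coded sampling settings (`do_sample`, `temperature`, `max_length`, `max_new_tokens`, `pad_token_id`) are gone, so generation falls back to transformers' defaults (greedy decoding) unless a caller overrides them per call. A quick sanity check on the new config as plain JSON:

```python
import json

# The generation config after this commit, reconstructed from the diff above.
new_config = json.loads("""{
  "_from_model_config": true,
  "bos_token_id": 1,
  "eos_token_id": 2,
  "transformers_version": "4.40.1"
}""")

# None of the removed sampling keys remain, so callers get greedy decoding
# by default unless they pass their own generation parameters.
removed = {"do_sample", "temperature", "max_length", "max_new_tokens", "pad_token_id"}
assert removed.isdisjoint(new_config)
print(sorted(new_config))  # ['_from_model_config', 'bos_token_id', 'eos_token_id', 'transformers_version']
```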
model.safetensors CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:55a4c932d6740a6501b3af5f40e636beebe2e7574413afaee4a2d59dbd0f8991
3
  size 342033024
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:282615f24316173a4c1021d79d9d50e618504593e55d2e70c798dbb708217364
3
  size 342033024