End of training

README.md CHANGED
This student model is distilled from the teacher model [gpt2](https://huggingface.co/gpt2).
The [Distily](https://github.com/lapp0/distily) library was used for this distillation.

It achieves the following results on the evaluation set:
- eval_enwikippl: 611.0229
- eval_frwikippl: 4057.1116
- eval_zhwikippl: 17857.3535
- eval_loss: 7319.6802
- eval_runtime: 21.6635
- eval_samples_per_second: 46.161
- eval_steps_per_second: 11.54
<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->
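The enwikippl, frwikippl, and zhwikippl metrics above are perplexities on English, French, and Chinese Wikipedia text; the card does not show how the evaluation slices were sampled. As a minimal sketch of how such a number is produced with the Hugging Face `transformers` API (not Distily's actual evaluation harness; the checkpoint id and sample text below are placeholders):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# "gpt2" (the teacher) is used so the snippet runs standalone;
# substitute this repo's student checkpoint to reproduce the table.
model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer = AutoTokenizer.from_pretrained("gpt2")
model.eval()

def perplexity(text: str) -> float:
    # Perplexity = exp(mean next-token cross-entropy); with labels=input_ids,
    # the model shifts the targets internally and returns the mean loss.
    ids = tokenizer(text, return_tensors="pt").input_ids
    with torch.no_grad():
        loss = model(ids, labels=ids).loss
    return torch.exp(loss).item()

print(perplexity("The quick brown fox jumps over the lazy dog."))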
### Training hyperparameters

The following hyperparameters were used during training:
- distillation_objective: <distily.objectives.LegacyObjective object at 0x7f7f68372f20>
- train_embeddings: True
- learning_rate: 4e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
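The distillation_objective entry is a raw Python repr, so the actual loss definition is not visible in this card. As a hedged sketch only (the KL-divergence formulation, temperature handling, and names below are assumptions, not the documented behavior of `distily.objectives.LegacyObjective`), a typical logit-distillation objective paired with the listed Adam settings looks like:

```python
import torch
import torch.nn.functional as F
from transformers import AutoModelForCausalLM

student = AutoModelForCausalLM.from_pretrained("gpt2")  # stand-in for the student

def distillation_loss(student_logits, teacher_logits, temperature=1.0):
    # Soft-target loss: KL divergence between teacher and student
    # token distributions, scaled by T^2 to keep gradient magnitudes stable.
    log_p_student = F.log_softmax(student_logits / temperature, dim=-1)
    p_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    return F.kl_div(log_p_student, p_teacher, reduction="batchmean") * temperature**2

# Adam configured exactly as in the hyperparameter list above.
optimizer = torch.optim.Adam(
    student.parameters(), lr=4e-05, betas=(0.9, 0.999), eps=1e-08
)
```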
Peak GPU Memory: 15.7299 GB
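A figure like the one above is typically read from PyTorch's CUDA memory statistics; a generic sketch (assuming a CUDA device, and not necessarily how Distily records it):

```python
import torch

torch.cuda.reset_peak_memory_stats()
# ... run a training step or evaluation pass here ...
peak_gb = torch.cuda.max_memory_allocated() / 1024**3
print(f"Peak GPU Memory: {peak_gb:.4f} GB")
```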
| step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
| --- | --- | --- | --- | --- | --- | --- | --- | --- |
| **teacher eval** | | 30.2385 | 57.2728 | | | | | 18.1772 |
| 0 | 0 | 54063.0 | 57132.0859 | 330301.4375 | 21.4102 | 46.707 | 11.677 | 54288.6797 |
| 500 | 0.0202 | 2200.8281 | 11269.5508 | 12821.6318 | 21.5139 | 46.482 | 11.62 | 56764.8438 |
| 1000 | 0.0404 | 1665.7762 | 6195.2891 | 10439.8076 | 21.4414 | 46.639 | 11.66 | 26684.8594 |
| 1500 | 0.0606 | 1395.5389 | 6104.2388 | 9813.1201 | 21.6061 | 46.283 | 11.571 | 24228.9746 |
| 2000 | 0.0808 | 1240.8671 | 5769.0522 | 9419.9043 | 21.4266 | 46.671 | 11.668 | 25033.0820 |
| 2500 | 0.1010 | 1090.4764 | 4890.7593 | 9141.3115 | 21.621 | 46.251 | 11.563 | 20563.0996 |
| 3000 | 0.1212 | 1002.9867 | 4993.7222 | 8898.1758 | 21.5727 | 46.355 | 11.589 | 19028.0254 |
| 3500 | 0.1414 | 938.0837 | 5002.0063 | 8604.9277 | 21.6847 | 46.115 | 11.529 | 16472.4883 |
| 4000 | 0.1616 | 887.3153 | 5051.4448 | 8486.5283 | 21.6081 | 46.279 | 11.57 | 16085.5645 |
| 4500 | 0.1818 | 845.5585 | 4589.0835 | 8366.6562 | 21.4849 | 46.544 | 11.636 | 15081.8213 |
| 5000 | 0.2020 | 793.8573 | 4711.3706 | 8115.8398 | 21.5826 | 46.334 | 11.583 | 17669.9570 |
| 5500 | 0.2222 | 769.3093 | 4488.2871 | 8050.4321 | 21.6441 | 46.202 | 11.55 | 14243.0586 |
| 6000 | 0.2424 | 728.3535 | 4591.8325 | 7877.5039 | 21.5753 | 46.349 | 11.587 | 17997.4102 |
| 6500 | 0.2626 | 710.1467 | 4239.1436 | 7786.5601 | 21.6028 | 46.29 | 11.573 | 13550.0938 |
| 7000 | 0.2828 | 687.4926 | 4355.0205 | 7698.4961 | 21.646 | 46.198 | 11.549 | 17154.9648 |
| 7500 | 0.3030 | 671.2838 | 4329.4575 | 7575.6479 | 21.6404 | 46.21 | 11.552 | 14209.8184 |
| 8000 | 0.3232 | 651.2464 | 4281.1934 | 7469.9521 | 21.5348 | 46.437 | 11.609 | 11029.1934 |
| 8500 | 0.3434 | 644.4048 | 4005.5222 | 7476.8638 | 21.677 | 46.132 | 11.533 | 14790.6338 |
| 9000 | 0.3636 | 617.8220 | 3900.7300 | 7369.7598 | 21.5714 | 46.358 | 11.589 | 14878.7871 |
| 9500 | 0.3838 | 611.0229 | 4057.1116 | 7319.6802 | 21.6635 | 46.161 | 11.54 | 17857.3535 |
| 10000 | 0.4040 | 607.8168 | 4144.8633 | 7314.2402 | 21.6408 | 46.209 | 11.552 | 8414.2871 |
| 10500 | 0.4242 | 597.2198 | 3698.7766 | 7203.5840 | 21.435 | 46.653 | 11.663 | 6787.8618 |
| 11000 | 0.4444 | 583.0488 | 3682.1257 | 7166.8481 | 21.6697 | 46.147 | 11.537 | 8091.4233 |
| 11500 | 0.4646 | 571.3620 | 3917.5435 | 7146.4961 | 21.6445 | 46.201 | 11.55 | 6319.4146 |
| 12000 | 0.4848 | 573.1504 | 3768.2695 | 7058.4961 | 21.5215 | 46.465 | 11.616 | 7570.3271 |
| 12500 | 0.5051 | 566.9861 | 3949.1577 | 7061.6318 | 22.0963 | 45.256 | 11.314 | 8727.2812 |
| 13000 | 0.5253 | 558.8017 | 3803.5042 | 6961.4082 | 21.5517 | 46.4 | 11.6 | 9476.1670 |
| 13500 | 0.5455 | 550.8849 | 3855.4763 | 7017.2798 | 21.6135 | 46.267 | 11.567 | 11842.0234 |
| 14000 | 0.5657 | 546.9738 | 3748.3950 | 6947.0718 | 21.5141 | 46.481 | 11.62 | 11424.9463 |
| 14500 | 0.5859 | 535.9684 | 3708.3093 | 6870.6240 | 21.6892 | 46.106 | 11.526 | 9886.5801 |
| 15000 | 0.6061 | 528.7446 | 3590.0920 | 6851.8398 | 21.6211 | 46.251 | 11.563 | 14917.5742 |
| 15500 | 0.6263 | 521.6382 | 3602.8992 | 6849.3442 | 21.4696 | 46.577 | 11.644 | 10334.7578 |
| 16000 | 0.6465 | 517.2215 | 3595.4131 | 6779.7759 | 21.4612 | 46.596 | 11.649 | 13237.1143 |
| 16500 | 0.6667 | 518.6794 | 3385.7922 | 6766.4639 | 21.7793 | 45.915 | 11.479 | 10972.6348 |
| 17000 | 0.6869 | 515.1771 | 3393.7964 | 6732.6719 | 21.9049 | 45.652 | 11.413 | 9149.7510 |
| 17500 | 0.7071 | 501.9872 | 3414.7986 | 6733.2480 | 21.6388 | 46.213 | 11.553 | 6785.5962 |
| 18000 | 0.7273 | 501.7729 | 3411.7908 | 6693.3442 | 21.6999 | 46.083 | 11.521 | 5611.2793 |
| 18500 | 0.7475 | 497.0320 | 3345.2139 | 6665.5361 | 21.7517 | 45.973 | 11.493 | 6820.1172 |
| 19000 | 0.7677 | 494.9137 | 3287.4609 | 6658.4321 | 21.5993 | 46.298 | 11.574 | 8307.6650 |
| 19500 | 0.7879 | 489.8856 | 3310.1404 | 6703.3921 | 21.703 | 46.077 | 11.519 | 7612.3970 |
| 20000 | 0.8081 | 485.6058 | 3133.0156 | 6542.2720 | 21.6927 | 46.098 | 11.525 | 6796.0293 |
| 20500 | 0.8283 | 482.7485 | 3159.1875 | 6622.9761 | 21.8265 | 45.816 | 11.454 | 6659.4741 |
| 21000 | 0.8485 | 469.4856 | 3162.3074 | 6523.4561 | 21.7088 | 46.064 | 11.516 | 11412.7490 |
| 21500 | 0.8687 | 475.0967 | 3214.5735 | 6542.4639 | 21.8362 | 45.796 | 11.449 | 8429.4746 |
| 22000 | 0.8889 | 471.5591 | 3336.0288 | 6522.2080 | 21.7328 | 46.013 | 11.503 | 5576.1724 |
| 22500 | 0.9091 | 467.7843 | 3244.9744 | 6460.9600 | 21.7546 | 45.967 | 11.492 | 3847.1572 |
| 23000 | 0.9293 | 459.4413 | 3435.3245 | 6459.0400 | 21.7279 | 46.024 | 11.506 | 5404.9829 |
| 23500 | 0.9495 | 466.1707 | 3223.0857 | 6445.9839 | 21.8014 | 45.869 | 11.467 | 6019.9951 |
| 24000 | 0.9697 | 451.2469 | 3094.4868 | 6465.6318 | 21.8221 | 45.825 | 11.456 | 5374.3945 |
| 24500 | 0.9899 | 461.2016 | 3168.5593 | 6462.3042 | 21.5588 | 46.385 | 11.596 | 7845.6411 |
| 24750 | 1.0 | 453.6186 | 3034.7439 | 6427.3281 | 21.8641 | 45.737 | 11.434 | 7226.5781 |
### Framework versions

- Distily 0.2.0
logs/per_device_train_batch_size=4/events.out.tfevents.1723370138.93d6cbb3ad53 ADDED

version https://git-lfs.github.com/spec/v1
oid sha256:eae42ba4b31f101de5dfd727dea08ce460ab7bcd606f625aa06360d79f5f640c
size 253
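The three lines above are a Git LFS pointer, not the log itself: the repository stores this 253-byte stub while the actual TensorBoard event file lives in LFS storage. After fetching it (for example with `git lfs pull` in a local clone), the logged scalars can be read with TensorBoard's event-processing API; a small sketch (the tag names actually logged are unknown, so the code lists them first):

```python
from tensorboard.backend.event_processing.event_accumulator import EventAccumulator

# Path assumes a local clone with `git lfs pull` already run.
acc = EventAccumulator(
    "logs/per_device_train_batch_size=4/events.out.tfevents.1723370138.93d6cbb3ad53"
)
acc.Reload()       # parse the event file
print(acc.Tags())  # discover which tags were recorded

# Print the final logged value for each scalar tag.
for tag in acc.Tags()["scalars"]:
    last = acc.Scalars(tag)[-1]
    print(tag, last.step, last.value)
```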