End of training
README.md CHANGED

@@ -15,14 +15,14 @@ This student model is distilled from the teacher model [roneneldan/TinyStories-3

The [Distily](https://github.com/lapp0/distily) library was used for this distillation.

It achieves the following results on the evaluation set:
- eval_enwikippl:
- eval_frwikippl:
- eval_zhwikippl:
- eval_tinystoriesppl:
- eval_loss:
- eval_runtime: 6.
- eval_samples_per_second:
- eval_steps_per_second: 9.

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

@@ -47,7 +47,7 @@ More information needed

The following hyperparameters were used during training:
- distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=0, loss_fn=None, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=0, loss_fn=None, layer_mapper=None, projector=None))
- train_embeddings: True
- learning_rate: 0.
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42

@@ -62,106 +62,106 @@ Peak GPU Memory: 6.6058 GB

| step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
| 0 | 0 | 23232.2363 | 111004.0469 | 6.4068 | 6.
| 500 | 0.0101 |
| 1000 | 0.0202 |
| 1500 | 0.0303 |
| 2000 | 0.0404 |
| 2500 | 0.0505 |
| 3000 | 0.0606 |
| 3500 | 0.0707 |
| 4000 | 0.0808 |
| 4500 | 0.0909 |
| 5000 | 0.1010 |
| 5500 | 0.1111 |
| 6000 | 0.1212 |
| 6500 | 0.1313 |
| 7000 | 0.1414 |
| 7500 | 0.1515 |
| 8000 | 0.1616 |
| 8500 | 0.1717 |
| 9000 | 0.1818 |
| 9500 | 0.1919 |
| 10000 | 0.2020 |
| 10500 | 0.2121 |
| 11000 | 0.2222 |
| 11500 | 0.2323 |
| 12000 | 0.2424 |
| 12500 | 0.2525 |
| 13000 | 0.2626 |
| 13500 | 0.2727 |
| 14000 | 0.2828 |
| 14500 | 0.2929 |
| 15000 | 0.3030 |
| 15500 | 0.3131 |
| 16000 | 0.3232 |
| 16500 | 0.3333 |
| 17000 | 0.3434 |
| 17500 | 0.3535 |
| 18000 | 0.3636 |
| 18500 | 0.3737 |
| 19000 | 0.3838 |
| 19500 | 0.3939 |
| 20000 | 0.4040 |
| 20500 | 0.4141 |
| 21000 | 0.4242 |
| 21500 | 0.4343 |
| 22000 | 0.4444 |
| 22500 | 0.4545 |
| 23000 | 0.4646 |
| 23500 | 0.4747 |
| 24000 | 0.4848 |
| 24500 | 0.4949 |
| 25000 | 0.5051 |
| 25500 | 0.5152 |
| 26000 | 0.5253 |
| 26500 | 0.5354 |
| 27000 | 0.5455 |
| 27500 | 0.5556 |
| 28000 | 0.5657 |
| 28500 | 0.5758 |
| 29000 | 0.5859 |
| 29500 | 0.5960 |
| 30000 | 0.6061 |
| 30500 | 0.6162 |
| 31000 | 0.6263 |
| 31500 | 0.6364 |
| 32000 | 0.6465 |
| 32500 | 0.6566 |
| 33000 | 0.6667 |
| 33500 | 0.6768 |
| 34000 | 0.6869 |
| 34500 | 0.6970 |
| 35000 | 0.7071 |
| 35500 | 0.7172 |
| 36000 | 0.7273 |
| 36500 | 0.7374 |
| 37000 | 0.7475 |
| 37500 | 0.7576 |
| 38000 | 0.7677 |
| 38500 | 0.7778 |
| 39000 | 0.7879 |
| 39500 | 0.7980 |
| 40000 | 0.8081 |
| 40500 | 0.8182 |
| 41000 | 0.8283 |
| 41500 | 0.8384 |
| 42000 | 0.8485 |
| 42500 | 0.8586 |
| 43000 | 0.8687 |
| 43500 | 0.8788 |
| 44000 | 0.8889 |
| 44500 | 0.8990 |
| 45000 | 0.9091 |
| 45500 | 0.9192 |
| 46000 | 0.9293 |
| 46500 | 0.9394 |
| 47000 | 0.9495 |
| 47500 | 0.9596 |
| 48000 | 0.9697 |
| 48500 | 0.9798 |
| 49000 | 0.9899 |
| 49500 | 1.0 |

### Framework versions
- Distily 0.2.0

The [Distily](https://github.com/lapp0/distily) library was used for this distillation.

It achieves the following results on the evaluation set:
- eval_enwikippl: 203.4302
- eval_frwikippl: 112507.4219
- eval_zhwikippl: 1544401.25
- eval_tinystoriesppl: 10.4258
- eval_loss: 1.2132
- eval_runtime: 6.4934
- eval_samples_per_second: 77.001
- eval_steps_per_second: 9.702
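The `*ppl` metrics above are perplexities on four held-out corpora (English Wikipedia, French Wikipedia, Chinese Wikipedia, and TinyStories). As a general illustration — not Distily's actual evaluation code — perplexity is the exponential of the mean per-token negative log-likelihood:

```python
import math

def perplexity(token_nlls):
    """Exponential of the mean per-token negative log-likelihood (in nats)."""
    return math.exp(sum(token_nlls) / len(token_nlls))

# A token stream whose every token costs 1.2132 nats (the eval_loss above)
# has perplexity exp(1.2132); the reported eval_*ppl values differ from this
# because each is measured on its own corpus, not the loss's eval split.
example = perplexity([1.2132] * 3)
```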

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

The following hyperparameters were used during training:
- distillation_objective: DistillationObjective(logits_loss_component=LossComponent(label=logits, weight=1, loss_fn=kl, layer_mapper=None, projector=None), hs_loss_component=LossComponent(label=hs, weight=0, loss_fn=None, layer_mapper=None, projector=None), attn_loss_component=LossComponent(label=attn, weight=0, loss_fn=None, layer_mapper=None, projector=None))
- train_embeddings: True
- learning_rate: 0.004
- train_batch_size: 1
- eval_batch_size: 8
- seed: 42
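The `distillation_objective` above puts all of its weight on the logits component (`weight=1`, `loss_fn=kl`) and disables the hidden-state and attention components (`weight=0`). A minimal sketch of such a logits-only KL objective in PyTorch — an assumption-laden illustration, not Distily's implementation (temperature and masking details are omitted):

```python
import torch
import torch.nn.functional as F

def logits_kl_loss(student_logits, teacher_logits):
    """KL(teacher || student) over the vocabulary, averaged over positions.

    Sketch of a logits-only distillation loss: the student is pushed to
    match the teacher's next-token distribution at every position.
    """
    student_logp = F.log_softmax(student_logits, dim=-1)
    teacher_logp = F.log_softmax(teacher_logits, dim=-1)
    # kl_div takes log-probs as input; log_target=True says the target
    # is also given as log-probs.
    return F.kl_div(student_logp, teacher_logp,
                    reduction="batchmean", log_target=True)

# Toy example: 2 token positions over a 5-word vocabulary.
torch.manual_seed(0)
student = torch.randn(2, 5)
teacher = torch.randn(2, 5)
loss = logits_kl_loss(student, teacher)
```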

| step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | tinystoriesppl | zhwikippl |
| --- | --- | --- | --- | --- | --- | --- | --- | --- | --- |
| **teacher eval** | | 169.9865 | 47377.9414 | | | | | 3.9789 | 4998.1294 |
| 0 | 0 | 23232.2363 | 111004.0469 | 6.4068 | 6.5144 | 76.753 | 9.671 | 9550.5166 | 102446.0156 |
| 500 | 0.0101 | 286.9499 | 206403.0312 | 1.3987 | 6.5264 | 76.612 | 9.653 | 11.3159 | 2106274.75 |
| 1000 | 0.0202 | 213.8942 | 135439.8906 | 1.2477 | 6.5099 | 76.806 | 9.678 | 10.3927 | 1507759.375 |
| 1500 | 0.0303 | 202.3927 | 107609.4375 | 1.2191 | 6.4989 | 76.936 | 9.694 | 10.4513 | 1709189.875 |
| 2000 | 0.0404 | 204.6472 | 114618.8281 | 1.2146 | 6.5201 | 76.686 | 9.662 | 10.4206 | 1537001.75 |
| 2500 | 0.0505 | 208.5117 | 122800.5781 | 1.2144 | 6.5244 | 76.636 | 9.656 | 10.4021 | 1744199.25 |
| 3000 | 0.0606 | 203.1310 | 112761.25 | 1.2126 | 6.5031 | 76.887 | 9.688 | 10.3665 | 1447065.25 |
| 3500 | 0.0707 | 205.6802 | 116737.0 | 1.2140 | 6.5148 | 76.749 | 9.67 | 10.3661 | 1639513.0 |
| 4000 | 0.0808 | 206.5504 | 110652.8281 | 1.2139 | 6.5134 | 76.765 | 9.672 | 10.4681 | 1485400.5 |
| 4500 | 0.0909 | 205.1075 | 109474.6094 | 1.2136 | 6.5428 | 76.42 | 9.629 | 10.4159 | 1445137.25 |
| 5000 | 0.1010 | 203.7140 | 110279.3672 | 1.2138 | 6.5746 | 76.05 | 9.582 | 10.4668 | 1533724.75 |
| 5500 | 0.1111 | 205.5050 | 114489.6953 | 1.2128 | 6.5408 | 76.444 | 9.632 | 10.4305 | 1672203.5 |
| 6000 | 0.1212 | 202.5809 | 106884.2891 | 1.2136 | 6.5086 | 76.822 | 9.68 | 10.4206 | 1468064.25 |
| 6500 | 0.1313 | 202.1420 | 113494.2344 | 1.2131 | 6.4942 | 76.991 | 9.701 | 10.3451 | 1576036.75 |
| 7000 | 0.1414 | 206.9188 | 116819.1875 | 1.2130 | 6.4926 | 77.011 | 9.703 | 10.4310 | 1638637.625 |
| 7500 | 0.1515 | 203.4144 | 107367.2109 | 1.2134 | 6.5245 | 76.634 | 9.656 | 10.4859 | 1450158.5 |
| 8000 | 0.1616 | 203.0209 | 113174.9531 | 1.2135 | 6.5095 | 76.811 | 9.678 | 10.3033 | 1514209.25 |
| 8500 | 0.1717 | 199.6443 | 100468.0781 | 1.2131 | 6.5236 | 76.645 | 9.657 | 10.4392 | 1311043.875 |
| 9000 | 0.1818 | 200.8854 | 108095.5391 | 1.2137 | 6.5111 | 76.792 | 9.676 | 10.2138 | 1448225.0 |
| 9500 | 0.1919 | 203.4302 | 112507.4219 | 1.2132 | 6.4934 | 77.001 | 9.702 | 10.4258 | 1544401.25 |
| 10000 | 0.2020 | 205.1313 | 110403.75 | 1.2134 | 6.5021 | 76.898 | 9.689 | 10.4660 | 1595496.875 |
| 10500 | 0.2121 | 204.6551 | 114103.3906 | 1.2130 | 6.4999 | 76.924 | 9.692 | 10.3361 | 1507759.375 |
| 11000 | 0.2222 | 205.3062 | 115477.6953 | 1.2134 | 6.545 | 76.394 | 9.626 | 10.4582 | 1510175.625 |
| 11500 | 0.2323 | 203.7376 | 113942.7812 | 1.2127 | 6.5079 | 76.83 | 9.681 | 10.2722 | 1610896.125 |
| 12000 | 0.2424 | 202.9187 | 112713.6172 | 1.2131 | 6.5133 | 76.766 | 9.673 | 10.2812 | 1613476.0 |
| 12500 | 0.2525 | 206.7906 | 114135.5 | 1.2134 | 6.523 | 76.652 | 9.658 | 10.4301 | 1588700.75 |
| 13000 | 0.2626 | 204.8058 | 115885.1172 | 1.2133 | 6.4895 | 77.047 | 9.708 | 10.4284 | 1610896.125 |
| 13500 | 0.2727 | 204.1484 | 111434.8906 | 1.2131 | 6.5086 | 76.821 | 9.68 | 10.4841 | 1585313.625 |
| 14000 | 0.2828 | 205.4413 | 112888.3203 | 1.2134 | 6.5065 | 76.846 | 9.683 | 10.4280 | 1526377.875 |
| 14500 | 0.2929 | 205.4494 | 112856.5703 | 1.2134 | 6.495 | 76.982 | 9.7 | 10.4262 | 1574355.75 |
| 15000 | 0.3030 | 205.5369 | 111938.2734 | 1.2127 | 6.4958 | 76.973 | 9.699 | 10.4008 | 1513402.25 |
| 15500 | 0.3131 | 203.0129 | 113366.3672 | 1.2131 | 6.4902 | 77.039 | 9.707 | 10.2892 | 1587854.125 |
| 16000 | 0.3232 | 205.5687 | 112824.8203 | 1.2131 | 6.4883 | 77.061 | 9.71 | 10.4383 | 1514209.25 |
| 16500 | 0.3333 | 204.0773 | 109289.7656 | 1.2133 | 6.5018 | 76.901 | 9.69 | 10.3910 | 1506954.375 |
| 17000 | 0.3434 | 203.9982 | 112824.8203 | 1.2133 | 6.4897 | 77.046 | 9.708 | 10.4482 | 1531272.375 |
| 17500 | 0.3535 | 206.1109 | 113558.2188 | 1.2127 | 6.4981 | 76.946 | 9.695 | 10.3738 | 1565141.625 |
| 18000 | 0.3636 | 202.1342 | 104355.2578 | 1.2131 | 6.5441 | 76.405 | 9.627 | 10.5219 | 1446294.0 |
| 18500 | 0.3737 | 204.3779 | 110777.6328 | 1.2133 | 6.5069 | 76.842 | 9.682 | 10.4392 | 1523935.75 |
| 19000 | 0.3838 | 206.5424 | 116162.8438 | 1.2127 | 6.5099 | 76.806 | 9.678 | 10.3374 | 1635144.0 |
| 19500 | 0.3939 | 204.2829 | 112127.7031 | 1.2129 | 6.5354 | 76.507 | 9.64 | 10.4223 | 1463371.75 |
| 20000 | 0.4040 | 202.3927 | 108844.1641 | 1.2132 | 6.5126 | 76.775 | 9.674 | 10.4413 | 1544401.25 |
| 20500 | 0.4141 | 202.5260 | 112777.1641 | 1.2129 | 6.5646 | 76.166 | 9.597 | 10.2214 | 1525562.875 |
| 21000 | 0.4242 | 205.3539 | 112380.6719 | 1.2131 | 6.5114 | 76.788 | 9.675 | 10.4495 | 1508563.375 |
| 21500 | 0.4343 | 204.9645 | 111749.2656 | 1.2127 | 6.4962 | 76.968 | 9.698 | 10.4219 | 1483816.125 |
| 22000 | 0.4444 | 202.1969 | 112269.9062 | 1.2133 | 6.4999 | 76.924 | 9.692 | 10.2625 | 1589549.5 |
| 22500 | 0.4545 | 207.1032 | 116425.0312 | 1.2134 | 6.4975 | 76.952 | 9.696 | 10.4021 | 1580246.375 |
| 23000 | 0.4646 | 202.7693 | 111560.5781 | 1.2130 | 6.516 | 76.734 | 9.669 | 10.3301 | 1416885.75 |
| 23500 | 0.4747 | 205.1631 | 115266.4453 | 1.2131 | 6.5106 | 76.798 | 9.676 | 10.3144 | 1592945.75 |
| 24000 | 0.4848 | 202.1498 | 110730.8438 | 1.2127 | 6.5345 | 76.516 | 9.641 | 10.2926 | 1512594.25 |
| 24500 | 0.4949 | 203.0680 | 111011.8828 | 1.2129 | 6.5175 | 76.717 | 9.666 | 10.4000 | 1447451.75 |
| 25000 | 0.5051 | 206.4065 | 112254.0625 | 1.2134 | 6.5182 | 76.708 | 9.665 | 10.4297 | 1569322.125 |
| 25500 | 0.5152 | 202.6124 | 107427.6406 | 1.2126 | 6.4893 | 77.05 | 9.708 | 10.4262 | 1486192.625 |
| 26000 | 0.5253 | 202.0011 | 105775.9453 | 1.2127 | 6.4913 | 77.026 | 9.705 | 10.4060 | 1452093.125 |
| 26500 | 0.5354 | 202.9501 | 108034.6328 | 1.2127 | 6.5044 | 76.871 | 9.686 | 10.3712 | 1466498.375 |
| 27000 | 0.5455 | 202.7301 | 108095.5391 | 1.2132 | 6.4959 | 76.971 | 9.698 | 10.3665 | 1505347.0 |
| 27500 | 0.5556 | 205.7360 | 112507.4219 | 1.2127 | 6.5028 | 76.89 | 9.688 | 10.4413 | 1565141.625 |
| 28000 | 0.5657 | 202.0011 | 108034.6328 | 1.2129 | 6.4956 | 76.975 | 9.699 | 10.3837 | 1484607.375 |
| 28500 | 0.5758 | 203.2096 | 108706.2969 | 1.2129 | 6.489 | 77.053 | 9.709 | 10.4331 | 1527191.875 |
| 29000 | 0.5859 | 203.6351 | 110699.5859 | 1.2126 | 6.5202 | 76.685 | 9.662 | 10.4094 | 1558474.75 |
| 29500 | 0.5960 | 202.6594 | 107065.1641 | 1.2134 | 6.5114 | 76.789 | 9.675 | 10.3944 | 1468064.25 |
| 30000 | 0.6061 | 202.6280 | 107306.7109 | 1.2129 | 6.5207 | 76.678 | 9.661 | 10.4103 | 1475918.5 |
| 30500 | 0.6162 | 202.7379 | 108645.0469 | 1.2128 | 6.5153 | 76.743 | 9.67 | 10.4185 | 1483816.125 |
| 31000 | 0.6263 | 203.3198 | 109120.5234 | 1.2133 | 6.5273 | 76.602 | 9.652 | 10.4301 | 1528822.5 |
| 31500 | 0.6364 | 203.8245 | 111875.3047 | 1.2126 | 6.4916 | 77.022 | 9.705 | 10.4008 | 1555152.125 |
| 32000 | 0.6465 | 202.9107 | 107730.7109 | 1.2131 | 6.5232 | 76.65 | 9.658 | 10.4219 | 1534544.125 |
| 32500 | 0.6566 | 201.8682 | 108339.4062 | 1.2131 | 6.5251 | 76.627 | 9.655 | 10.3674 | 1543578.125 |
| 33000 | 0.6667 | 203.7298 | 109922.6797 | 1.2126 | 6.5005 | 76.917 | 9.692 | 10.3948 | 1507759.375 |
| 33500 | 0.6768 | 203.3198 | 109582.6172 | 1.2127 | 6.5091 | 76.815 | 9.679 | 10.3961 | 1475918.5 |
| 34000 | 0.6869 | 204.9724 | 111309.4609 | 1.2127 | 6.5069 | 76.842 | 9.682 | 10.4452 | 1569322.125 |
| 34500 | 0.6970 | 203.1310 | 108828.9062 | 1.2128 | 6.5102 | 76.802 | 9.677 | 10.3918 | 1510175.625 |
| 35000 | 0.7071 | 202.9815 | 109197.3516 | 1.2128 | 6.5175 | 76.716 | 9.666 | 10.3639 | 1523935.75 |
| 35500 | 0.7172 | 203.1467 | 109135.9297 | 1.2131 | 6.4917 | 77.021 | 9.705 | 10.4056 | 1473558.5 |
| 36000 | 0.7273 | 203.3513 | 109151.2266 | 1.2131 | 6.4872 | 77.075 | 9.711 | 10.4163 | 1507759.375 |
| 36500 | 0.7374 | 202.7301 | 110808.8047 | 1.2131 | 6.5038 | 76.878 | 9.687 | 10.3554 | 1549353.625 |
| 37000 | 0.7475 | 202.9422 | 109258.9141 | 1.2129 | 6.4954 | 76.978 | 9.699 | 10.3987 | 1519876.25 |
| 37500 | 0.7576 | 203.2096 | 110139.6875 | 1.2127 | 6.4839 | 77.114 | 9.716 | 10.4038 | 1506954.375 |
| 38000 | 0.7677 | 204.0456 | 109860.7422 | 1.2125 | 6.4926 | 77.011 | 9.703 | 10.4392 | 1483024.0 |
| 38500 | 0.7778 | 203.1624 | 109197.3516 | 1.2129 | 6.5015 | 76.905 | 9.69 | 10.3837 | 1487779.5 |
| 39000 | 0.7879 | 203.2883 | 109274.3359 | 1.2126 | 6.5062 | 76.85 | 9.683 | 10.4038 | 1519064.75 |
| 39500 | 0.7980 | 203.1152 | 109089.8281 | 1.2129 | 6.5028 | 76.89 | 9.688 | 10.4211 | 1506151.25 |
| 40000 | 0.8081 | 203.2411 | 109551.6875 | 1.2128 | 6.5112 | 76.79 | 9.676 | 10.4129 | 1498136.125 |
| 40500 | 0.8182 | 203.1467 | 109767.9531 | 1.2127 | 6.4904 | 77.037 | 9.707 | 10.4167 | 1483816.125 |
| 41000 | 0.8283 | 202.9107 | 108767.5859 | 1.2131 | 6.4957 | 76.974 | 9.699 | 10.4017 | 1498935.0 |
| 41500 | 0.8384 | 202.9736 | 109151.2266 | 1.2126 | 6.4931 | 77.005 | 9.703 | 10.4038 | 1485400.5 |
| 42000 | 0.8485 | 203.2883 | 109644.3984 | 1.2127 | 6.5023 | 76.895 | 9.689 | 10.4051 | 1476707.0 |
| 42500 | 0.8586 | 202.5966 | 108767.5859 | 1.2126 | 6.4932 | 77.003 | 9.702 | 10.4069 | 1475918.5 |
| 43000 | 0.8687 | 204.2354 | 110668.4453 | 1.2127 | 6.4979 | 76.948 | 9.695 | 10.4211 | 1520686.625 |
| 43500 | 0.8788 | 203.8402 | 111403.5469 | 1.2124 | 6.5058 | 76.855 | 9.684 | 10.3708 | 1523123.625 |
| 44000 | 0.8889 | 204.0615 | 111215.3359 | 1.2126 | 6.5458 | 76.384 | 9.624 | 10.3884 | 1531272.375 |
| 44500 | 0.8990 | 203.8245 | 110263.9062 | 1.2126 | 6.5295 | 76.576 | 9.649 | 10.3875 | 1507759.375 |
| 45000 | 0.9091 | 203.8718 | 110668.4453 | 1.2125 | 6.4991 | 76.934 | 9.694 | 10.3854 | 1514209.25 |
| 45500 | 0.9192 | 203.0051 | 109953.7188 | 1.2126 | 6.5266 | 76.609 | 9.653 | 10.3824 | 1494942.0 |
| 46000 | 0.9293 | 203.7772 | 110606.0938 | 1.2126 | 6.5355 | 76.506 | 9.64 | 10.4064 | 1504544.75 |
| 46500 | 0.9394 | 203.8087 | 110606.0938 | 1.2127 | 6.5263 | 76.613 | 9.653 | 10.4051 | 1504544.75 |
| 47000 | 0.9495 | 203.3671 | 109953.7188 | 1.2126 | 6.5115 | 76.788 | 9.675 | 10.3957 | 1504544.75 |
| 47500 | 0.9596 | 203.6351 | 110357.1172 | 1.2126 | 6.4982 | 76.945 | 9.695 | 10.4004 | 1506954.375 |
| 48000 | 0.9697 | 203.2254 | 110512.6719 | 1.2124 | 6.5168 | 76.725 | 9.667 | 10.3837 | 1504544.75 |
| 48500 | 0.9798 | 203.6509 | 110606.0938 | 1.2124 | 6.49 | 77.041 | 9.707 | 10.4060 | 1504544.75 |
| 49000 | 0.9899 | 203.7140 | 110606.0938 | 1.2126 | 6.4938 | 76.997 | 9.702 | 10.4090 | 1507759.375 |
| 49500 | 1.0 | 203.7456 | 110606.0938 | 1.2126 | 6.5193 | 76.695 | 9.664 | 10.4090 | 1507759.375 |
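The epoch column in the table is simply the step count divided by the 49,500 optimizer steps that make up one pass over the training data (the final row: step 49500, epoch 1.0; with train_batch_size 1 and assuming no gradient accumulation, that implies roughly 49,500 training samples). A quick sanity check of that arithmetic:

```python
STEPS_PER_EPOCH = 49500  # from the table's final row: step 49500 -> epoch 1.0

def epoch_at(step):
    """Fractional epoch for a given optimizer step, rounded as logged above."""
    return round(step / STEPS_PER_EPOCH, 4)

# Matches the table: step 500 -> 0.0101, step 25000 -> 0.5051.
checks = [epoch_at(500), epoch_at(25000), epoch_at(49500)]
```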

### Framework versions
- Distily 0.2.0

logs/dropout=0, learning_rate=0.004, per_device_train_batch_size=1, weight_decay=0.1/events.out.tfevents.1723919839.5f530b1cf724 ADDED

@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:855cf4b1d2b45012d36fafc0aa15ad24881fed5b7a83491a525423c919468fa2
size 312