End of training
README.md CHANGED
@@ -16,13 +16,13 @@ This student model is distilled from the teacher model [gpt2](https://huggingfac
 The [Distily](https://github.com/lapp0/distily) library was used for this distillation.
 
 It achieves the following results on the evaluation set:
-- eval_enwikippl:
-- eval_frwikippl:
-- eval_zhwikippl:
-- eval_loss:
-- eval_runtime:
-- eval_samples_per_second: 58.
-- eval_steps_per_second: 7.
+- eval_enwikippl: 433.0859
+- eval_frwikippl: 2823.5620
+- eval_zhwikippl: 4932.8379
+- eval_loss: 21.1035
+- eval_runtime: 34.4485
+- eval_samples_per_second: 58.058
+- eval_steps_per_second: 7.257
 
 <!-- This model card has been generated automatically according to the information the Trainer had access to. You
 should probably proofread and complete it, then remove this comment.
@@ -45,7 +45,7 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- distillation_objective: MultiObjective(logits_weight=1, logits_loss_fn=(fn:kl_divergence_loss()), activations_weight=0, activations_loss_fn=(fn:mse_loss()), attentions_weight=0, attentions_loss_fn=(fn:mse_loss()))
+- distillation_objective: MultiObjective(logits_weight=1, logits_loss_fn=(fn:kl_divergence_loss()), activations_weight=0.1, activations_loss_fn=(fn:mse_loss()), attentions_weight=0, attentions_loss_fn=(fn:mse_loss()))
 - train_embeddings: True
 - learning_rate: 4e-05
 - train_batch_size: 8
@@ -56,38 +56,38 @@ The following hyperparameters were used during training:
 - num_epochs: 1.0
 
 ### Resource Usage
-Peak GPU Memory:
+Peak GPU Memory: 8.0893 GB
 
 ### Eval-Phase Metrics
 | step | epoch | enwikippl | frwikippl | loss | runtime | samples_per_second | steps_per_second | zhwikippl |
 | --- | --- | --- | --- | --- | --- | --- | --- | --- |
 | **teacher eval** | | 30.2086 | 57.2728 | | | | | 18.1784 |
-| 0 | 0 |
-| 1000 | 0.0404 |
-| 2000 | 0.0808 |
-| 3000 | 0.1212 |
-| 4000 | 0.1616 |
-| 5000 | 0.2020 |
-| 6000 | 0.2424 |
-| 7000 | 0.2828 |
-| 8000 | 0.3232 |
-| 9000 | 0.3636 |
-| 10000 | 0.4040 |
-| 11000 | 0.4444 |
-| 12000 | 0.4848 |
-| 13000 | 0.5253 |
-| 14000 | 0.5657 |
-| 15000 | 0.6061 |
-| 16000 | 0.6465 |
-| 17000 | 0.6869 |
-| 18000 | 0.7273 |
-| 19000 | 0.7677 |
-| 20000 | 0.8081 |
-| 21000 | 0.8485 |
-| 22000 | 0.8889 |
-| 23000 | 0.9293 |
-| 24000 | 0.9697 |
-| 24750 | 1.0 |
+| 0 | 0 | 54069.2930 | 57285.3438 | 69.6280 | 34.3114 | 58.29 | 7.286 | 54227.1016 |
+| 1000 | 0.0404 | 1149.4497 | 6758.9292 | 22.9270 | 34.3626 | 58.203 | 7.275 | 55191.4258 |
+| 2000 | 0.0808 | 848.3209 | 5094.3662 | 22.2020 | 34.3795 | 58.174 | 7.272 | 14284.0166 |
+| 3000 | 0.1212 | 700.4797 | 4480.8540 | 21.8288 | 34.371 | 58.189 | 7.274 | 7045.9990 |
+| 4000 | 0.1616 | 615.9059 | 3635.8176 | 21.5565 | 34.4355 | 58.08 | 7.26 | 3316.0488 |
+| 5000 | 0.2020 | 556.0313 | 3492.5959 | 21.4455 | 34.3262 | 58.265 | 7.283 | 4788.7505 |
+| 6000 | 0.2424 | 528.5394 | 3328.1577 | 21.2810 | 34.3681 | 58.193 | 7.274 | 3058.2744 |
+| 7000 | 0.2828 | 479.2375 | 2988.6665 | 21.2197 | 34.3863 | 58.163 | 7.27 | 3689.9192 |
+| 8000 | 0.3232 | 448.9053 | 2847.9541 | 21.0785 | 34.5149 | 57.946 | 7.243 | 1743.5521 |
+| 9000 | 0.3636 | 433.0859 | 2823.5620 | 21.1035 | 34.4485 | 58.058 | 7.257 | 4932.8379 |
+| 10000 | 0.4040 | 423.8369 | 2843.9414 | 21.0105 | 34.4298 | 58.089 | 7.261 | 3959.4795 |
+| 11000 | 0.4444 | 394.3074 | 2524.8374 | 20.9575 | 34.5178 | 57.941 | 7.243 | 6243.0879 |
+| 12000 | 0.4848 | 385.4673 | 2595.5920 | 20.9185 | 34.4535 | 58.049 | 7.256 | 17321.8613 |
+| 13000 | 0.5253 | 369.9537 | 2477.9255 | 20.8475 | 34.4953 | 57.979 | 7.247 | 2443.6860 |
+| 14000 | 0.5657 | 358.8618 | 2519.8567 | 20.7897 | 34.9016 | 57.304 | 7.163 | 3639.9983 |
+| 15000 | 0.6061 | 343.0577 | 2395.4692 | 20.7710 | 34.3143 | 58.285 | 7.286 | 1816.2738 |
+| 16000 | 0.6465 | 343.8312 | 2195.5515 | 20.7428 | 34.184 | 58.507 | 7.313 | 14709.8760 |
+| 17000 | 0.6869 | 336.7496 | 2234.2798 | 20.7590 | 34.4691 | 58.023 | 7.253 | 6489.5991 |
+| 18000 | 0.7273 | 338.3747 | 2191.5310 | 20.6583 | 34.4634 | 58.033 | 7.254 | 2819.0298 |
+| 19000 | 0.7677 | 324.3280 | 2071.9238 | 20.6345 | 34.4307 | 58.088 | 7.261 | 3877.8486 |
+| 20000 | 0.8081 | 315.1911 | 2056.7864 | 20.5710 | 34.2186 | 58.448 | 7.306 | 3151.9771 |
+| 21000 | 0.8485 | 315.4604 | 2161.1489 | 20.5432 | 34.5086 | 57.957 | 7.245 | 3105.1853 |
+| 22000 | 0.8889 | 324.6304 | 1950.2999 | 20.6125 | 34.2565 | 58.383 | 7.298 | 2055.8921 |
+| 23000 | 0.9293 | 313.9452 | 1958.0153 | 20.5900 | 34.5413 | 57.902 | 7.238 | 4405.8896 |
+| 24000 | 0.9697 | 311.3475 | 1918.9283 | 20.5405 | 34.2718 | 58.357 | 7.295 | 11800.9756 |
+| 24750 | 1.0 | 303.2348 | 1956.3597 | 20.4700 | 34.3296 | 58.259 | 7.282 | 15104.0020 |
 
 ### Framework versions
 - Distily 0.2.0
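The distillation_objective in the hyperparameters above combines a KL-divergence loss on the logits (weight 1) with an MSE loss on intermediate activations (weight 0.1, raised from 0 in this commit). Below is a minimal sketch of how such a weighted multi-objective loss could be composed in PyTorch; the function name and signature are illustrative assumptions, not the Distily API, and it assumes student and teacher tensors of matching shapes.

```python
# Illustrative sketch only (not the Distily implementation) of a multi-objective
# distillation loss: logits_weight * KL(logits) + activations_weight * MSE(activations).
import torch
import torch.nn.functional as F

def multi_objective_distillation_loss(student_logits, teacher_logits,
                                      student_hidden, teacher_hidden,
                                      logits_weight=1.0, activations_weight=0.1,
                                      temperature=1.0):
    # KL divergence between teacher and student token distributions
    # (student in log-space, teacher as probabilities).
    kl = F.kl_div(
        F.log_softmax(student_logits / temperature, dim=-1),
        F.softmax(teacher_logits / temperature, dim=-1),
        reduction="batchmean",
    )
    # MSE between intermediate activations (e.g. matching hidden states).
    mse = F.mse_loss(student_hidden, teacher_hidden)
    return logits_weight * kl + activations_weight * mse
```

With attentions_weight=0, the attention-map MSE term drops out, so only the two terms above contribute to the total loss.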
logs/distillation_objective=MultiObjective(logits_weight_1__logits_loss_fn_(fn_kl_divergence_loss())__activations_weight_0.1__activations_loss_fn_(fn_mse_loss())__attentions_weight_0__attentions_loss_fn_(f/events.out.tfevents.1723454082.93d6cbb3ad53 ADDED
@@ -0,0 +1,3 @@
+version https://git-lfs.github.com/spec/v1
+oid sha256:4959a9e5649fc069aa3e5f9ccef01630bd6f0527a058567ebd3aaf8c73560990
+size 253
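The enwikippl, frwikippl, and zhwikippl columns in the eval table above are perplexities on English, French, and Chinese Wikipedia evaluation text. As a rough sketch of how such a perplexity could be reproduced for the distilled checkpoint, assuming a standard transformers causal-LM model (the model id and evaluation text below are placeholders, and the exact evaluation data and chunking used for the card are not shown here):

```python
# Rough perplexity sketch for a causal LM on held-out text (illustrative only).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "distilled-gpt2-checkpoint"  # placeholder, not the actual repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
model.eval()

text = "..."  # placeholder: a held-out evaluation passage, e.g. a Wikipedia sample
enc = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # Passing labels=input_ids makes the model return the mean cross-entropy loss.
    out = model(**enc, labels=enc["input_ids"])

perplexity = torch.exp(out.loss)  # ppl = exp(mean negative log-likelihood)
print(perplexity.item())
```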