2021-05-31 00:20:49,022	INFO	__main__	Namespace(adjust_lr=False, config='torchdistill/configs/sample/glue/cola/kd/bert_base_uncased_from_bert_large_uncased.yaml', log='log/glue/cola/kd/bert_base_uncased_from_bert_large_uncased.txt', private_output='leaderboard/glue/kd/bert_base_uncased_from_bert_large_uncased/', seed=None, student_only=False, task_name='cola', test_only=False, world_size=1)
2021-05-31 00:20:49,055	INFO	__main__	Distributed environment: NO
Num processes: 1
Process index: 0
Local process index: 0
Device: cuda
Use FP16 precision: True

2021-05-31 00:20:57,823	WARNING	datasets.builder	Reusing dataset glue (/root/.cache/huggingface/datasets/glue/cola/1.0.0/dacbe3125aa31d7f70367a07a8a9e72a5a0bfeb5fc42e75c9db75b96da6053ad)
2021-05-31 00:20:58,582	INFO	__main__	Start training
2021-05-31 00:20:58,582	INFO	torchdistill.models.util	[teacher model]
2021-05-31 00:20:58,582	INFO	torchdistill.models.util	Using the original teacher model
2021-05-31 00:20:58,582	INFO	torchdistill.models.util	[student model]
2021-05-31 00:20:58,582	INFO	torchdistill.models.util	Using the original student model
2021-05-31 00:20:58,583	INFO	torchdistill.core.distillation	Loss = 1.0 * OrgLoss
2021-05-31 00:20:58,583	INFO	torchdistill.core.distillation	Freezing the whole teacher model
2021-05-31 00:21:01,585	INFO	torchdistill.misc.log	Epoch: [0]  [  0/535]  eta: 0:01:04  lr: 9.993769470404985e-05  sample/s: 34.32832986855674  loss: 0.2715 (0.2715)  time: 0.1200  data: 0.0034  max mem: 1758
2021-05-31 00:21:07,875	INFO	torchdistill.misc.log	Epoch: [0]  [ 50/535]  eta: 0:01:00  lr: 9.682242990654206e-05  sample/s: 40.77632733415159  loss: 0.1915 (0.2239)  time: 0.1222  data: 0.0017  max mem: 2766
2021-05-31 00:21:14,086	INFO	torchdistill.misc.log	Epoch: [0]  [100/535]  eta: 0:00:54  lr: 9.370716510903426e-05  sample/s: 40.664152406805954  loss: 0.1580 (0.2037)  time: 0.1245  data: 0.0017  max mem: 2833
2021-05-31 00:21:20,465	INFO	torchdistill.misc.log	Epoch: [0]  [150/535]  eta: 0:00:48  lr: 9.059190031152648e-05  sample/s: 27.442853614372105  loss: 0.2045 (0.1980)  time: 0.1318  data: 0.0018  max mem: 2833
2021-05-31 00:21:26,554	INFO	torchdistill.misc.log	Epoch: [0]  [200/535]  eta: 0:00:41  lr: 8.74766355140187e-05  sample/s: 32.08432808705131  loss: 0.1471 (0.1922)  time: 0.1235  data: 0.0016  max mem: 2833
2021-05-31 00:21:32,711	INFO	torchdistill.misc.log	Epoch: [0]  [250/535]  eta: 0:00:35  lr: 8.436137071651092e-05  sample/s: 32.09114018963311  loss: 0.1522 (0.1857)  time: 0.1219  data: 0.0017  max mem: 2833
2021-05-31 00:21:38,845	INFO	torchdistill.misc.log	Epoch: [0]  [300/535]  eta: 0:00:29  lr: 8.124610591900313e-05  sample/s: 32.025533052093074  loss: 0.1108 (0.1777)  time: 0.1217  data: 0.0017  max mem: 2833
2021-05-31 00:21:45,058	INFO	torchdistill.misc.log	Epoch: [0]  [350/535]  eta: 0:00:22  lr: 7.813084112149533e-05  sample/s: 32.00012207077816  loss: 0.1630 (0.1759)  time: 0.1264  data: 0.0017  max mem: 2836
2021-05-31 00:21:51,311	INFO	torchdistill.misc.log	Epoch: [0]  [400/535]  eta: 0:00:16  lr: 7.501557632398754e-05  sample/s: 27.491013960804878  loss: 0.1091 (0.1707)  time: 0.1256  data: 0.0017  max mem: 2836
2021-05-31 00:21:57,642	INFO	torchdistill.misc.log	Epoch: [0]  [450/535]  eta: 0:00:10  lr: 7.190031152647976e-05  sample/s: 32.02229335226741  loss: 0.1059 (0.1657)  time: 0.1231  data: 0.0016  max mem: 2836
2021-05-31 00:22:03,683	INFO	torchdistill.misc.log	Epoch: [0]  [500/535]  eta: 0:00:04  lr: 6.878504672897197e-05  sample/s: 40.68426872562904  loss: 0.1292 (0.1631)  time: 0.1142  data: 0.0017  max mem: 2915
2021-05-31 00:22:07,877	INFO	torchdistill.misc.log	Epoch: [0] Total time: 0:01:06
2021-05-31 00:22:08,938	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
2021-05-31 00:22:08,938	INFO	__main__	Validation: matthews_correlation = 0.5696227147853862
2021-05-31 00:22:08,938	INFO	__main__	Updating ckpt at ./resource/ckpt/glue/cola/kd/cola-bert-base-uncased_from_bert-large-uncased
2021-05-31 00:22:10,190	INFO	torchdistill.misc.log	Epoch: [1]  [  0/535]  eta: 0:00:55  lr: 6.660436137071651e-05  sample/s: 39.90689067075158  loss: 0.0587 (0.0587)  time: 0.1033  data: 0.0031  max mem: 2915
2021-05-31 00:22:16,375	INFO	torchdistill.misc.log	Epoch: [1]  [ 50/535]  eta: 0:00:59  lr: 6.348909657320873e-05  sample/s: 40.69028776542207  loss: 0.0609 (0.0709)  time: 0.1267  data: 0.0016  max mem: 2915
2021-05-31 00:22:22,636	INFO	torchdistill.misc.log	Epoch: [1]  [100/535]  eta: 0:00:54  lr: 6.037383177570094e-05  sample/s: 40.703219921200244  loss: 0.0721 (0.0677)  time: 0.1257  data: 0.0017  max mem: 2915
2021-05-31 00:22:28,858	INFO	torchdistill.misc.log	Epoch: [1]  [150/535]  eta: 0:00:47  lr: 5.7258566978193154e-05  sample/s: 23.86856736377863  loss: 0.0641 (0.0676)  time: 0.1254  data: 0.0017  max mem: 2915
2021-05-31 00:22:35,146	INFO	torchdistill.misc.log	Epoch: [1]  [200/535]  eta: 0:00:41  lr: 5.414330218068536e-05  sample/s: 32.09961332488936  loss: 0.0410 (0.0655)  time: 0.1278  data: 0.0017  max mem: 2915
2021-05-31 00:22:41,551	INFO	torchdistill.misc.log	Epoch: [1]  [250/535]  eta: 0:00:35  lr: 5.1028037383177574e-05  sample/s: 32.12475203541585  loss: 0.0638 (0.0681)  time: 0.1273  data: 0.0017  max mem: 2915
2021-05-31 00:22:47,871	INFO	torchdistill.misc.log	Epoch: [1]  [300/535]  eta: 0:00:29  lr: 4.791277258566979e-05  sample/s: 27.227165470614775  loss: 0.0764 (0.0700)  time: 0.1252  data: 0.0017  max mem: 2915
2021-05-31 00:22:54,042	INFO	torchdistill.misc.log	Epoch: [1]  [350/535]  eta: 0:00:23  lr: 4.4797507788161994e-05  sample/s: 32.06403167947466  loss: 0.0372 (0.0682)  time: 0.1282  data: 0.0017  max mem: 2915
2021-05-31 00:23:00,212	INFO	torchdistill.misc.log	Epoch: [1]  [400/535]  eta: 0:00:16  lr: 4.168224299065421e-05  sample/s: 32.087089401662  loss: 0.0755 (0.0690)  time: 0.1181  data: 0.0016  max mem: 2915
2021-05-31 00:23:06,665	INFO	torchdistill.misc.log	Epoch: [1]  [450/535]  eta: 0:00:10  lr: 3.856697819314642e-05  sample/s: 27.528814883122593  loss: 0.0612 (0.0692)  time: 0.1331  data: 0.0017  max mem: 2917
2021-05-31 00:23:13,008	INFO	torchdistill.misc.log	Epoch: [1]  [500/535]  eta: 0:00:04  lr: 3.545171339563863e-05  sample/s: 32.032503684921984  loss: 0.0505 (0.0688)  time: 0.1282  data: 0.0017  max mem: 2917
2021-05-31 00:23:17,244	INFO	torchdistill.misc.log	Epoch: [1] Total time: 0:01:07
2021-05-31 00:23:18,310	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
2021-05-31 00:23:18,310	INFO	__main__	Validation: matthews_correlation = 0.5915362000526524
2021-05-31 00:23:18,310	INFO	__main__	Updating ckpt at ./resource/ckpt/glue/cola/kd/cola-bert-base-uncased_from_bert-large-uncased
2021-05-31 00:23:19,657	INFO	torchdistill.misc.log	Epoch: [2]  [  0/535]  eta: 0:01:08  lr: 3.327102803738318e-05  sample/s: 31.84517412406494  loss: 0.0010 (0.0010)  time: 0.1285  data: 0.0029  max mem: 2917
2021-05-31 00:23:25,866	INFO	torchdistill.misc.log	Epoch: [2]  [ 50/535]  eta: 0:01:00  lr: 3.015576323987539e-05  sample/s: 40.70726907893619  loss: 0.0024 (0.0234)  time: 0.1229  data: 0.0017  max mem: 2917
2021-05-31 00:23:32,177	INFO	torchdistill.misc.log	Epoch: [2]  [100/535]  eta: 0:00:54  lr: 2.7040498442367603e-05  sample/s: 31.97889576352276  loss: 0.0047 (0.0244)  time: 0.1282  data: 0.0017  max mem: 2917
2021-05-31 00:23:38,580	INFO	torchdistill.misc.log	Epoch: [2]  [150/535]  eta: 0:00:48  lr: 2.3925233644859816e-05  sample/s: 32.0352560858947  loss: 0.0017 (0.0278)  time: 0.1293  data: 0.0017  max mem: 2917
2021-05-31 00:23:45,020	INFO	torchdistill.misc.log	Epoch: [2]  [200/535]  eta: 0:00:42  lr: 2.0809968847352026e-05  sample/s: 32.08377588115941  loss: 0.0033 (0.0264)  time: 0.1303  data: 0.0017  max mem: 2917
2021-05-31 00:23:51,367	INFO	torchdistill.misc.log	Epoch: [2]  [250/535]  eta: 0:00:36  lr: 1.769470404984424e-05  sample/s: 40.8114407758866  loss: 0.0048 (0.0263)  time: 0.1203  data: 0.0016  max mem: 2917
2021-05-31 00:23:57,611	INFO	torchdistill.misc.log	Epoch: [2]  [300/535]  eta: 0:00:29  lr: 1.457943925233645e-05  sample/s: 32.14740440404381  loss: 0.0006 (0.0274)  time: 0.1241  data: 0.0016  max mem: 2917
2021-05-31 00:24:03,995	INFO	torchdistill.misc.log	Epoch: [2]  [350/535]  eta: 0:00:23  lr: 1.1464174454828661e-05  sample/s: 32.08083110406584  loss: 0.0287 (0.0277)  time: 0.1269  data: 0.0017  max mem: 2917
2021-05-31 00:24:10,245	INFO	torchdistill.misc.log	Epoch: [2]  [400/535]  eta: 0:00:17  lr: 8.348909657320873e-06  sample/s: 40.49562511917779  loss: 0.0076 (0.0275)  time: 0.1253  data: 0.0016  max mem: 2917
2021-05-31 00:24:16,391	INFO	torchdistill.misc.log	Epoch: [2]  [450/535]  eta: 0:00:10  lr: 5.233644859813085e-06  sample/s: 32.0666056318915  loss: 0.0009 (0.0268)  time: 0.1182  data: 0.0016  max mem: 2917
2021-05-31 00:24:22,587	INFO	torchdistill.misc.log	Epoch: [2]  [500/535]  eta: 0:00:04  lr: 2.118380062305296e-06  sample/s: 32.093043185504854  loss: 0.0009 (0.0269)  time: 0.1207  data: 0.0018  max mem: 2917
2021-05-31 00:24:26,813	INFO	torchdistill.misc.log	Epoch: [2] Total time: 0:01:07
2021-05-31 00:24:27,875	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
2021-05-31 00:24:27,875	INFO	__main__	Validation: matthews_correlation = 0.6175138437153591
2021-05-31 00:24:27,876	INFO	__main__	Updating ckpt at ./resource/ckpt/glue/cola/kd/cola-bert-base-uncased_from_bert-large-uncased
2021-05-31 00:24:29,073	INFO	__main__	[Teacher: bert-large-uncased]
2021-05-31 00:24:31,822	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
2021-05-31 00:24:31,822	INFO	__main__	Test: matthews_correlation = 0.6335324951654004
2021-05-31 00:24:33,639	INFO	__main__	[Student: bert-base-uncased]
2021-05-31 00:24:34,706	INFO	/usr/local/lib/python3.7/dist-packages/datasets/metric.py	Removing /root/.cache/huggingface/metrics/glue/cola/default_experiment-1-0.arrow
2021-05-31 00:24:34,706	INFO	__main__	Test: matthews_correlation = 0.6175138437153591
2021-05-31 00:24:34,706	INFO	__main__	Start prediction for private dataset(s)
2021-05-31 00:24:34,707	INFO	__main__	cola/test: 1063 samples