shivanandmn committed · verified
Commit c35ba4f · 1 Parent(s): 2bdd010

Model save

Files changed (2):
  1. README.md +99 -0
  2. generation_config.json +7 -0
README.md ADDED
@@ -0,0 +1,99 @@
---
library_name: transformers
tags:
- generated_from_trainer
metrics:
- accuracy
- bleu
model-index:
- name: parallel-gpt2-medium-wikitext
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# parallel-gpt2-medium-wikitext

This model is a fine-tuned version of [](https://huggingface.co/) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 3.1010
- Accuracy: 0.4274
- Perplexity: 22.2205
- Bleu: 0.1461
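
As a quick consistency check on the numbers above, the reported perplexity is simply the exponential of the evaluation loss:

```python
import math

# Perplexity of a causal LM is exp(cross-entropy loss).
print(math.exp(3.1010))  # ~22.22, matching the reported 22.2205
```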
## Model description

More information needed

## Intended uses & limitations

More information needed
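
For illustration, a minimal generation sketch using the `transformers` API. The hub id `shivanandmn/parallel-gpt2-medium-wikitext` is inferred from this repository and may differ; if the "parallel" architecture is a custom model class, loading may additionally require `trust_remote_code=True`:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hub id inferred from this repo; adjust if the checkpoint lives elsewhere.
model_id = "shivanandmn/parallel-gpt2-medium-wikitext"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "The history of the English language"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```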
## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (reproduced as a `TrainingArguments` sketch below):
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
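
For reference, a minimal sketch of a `TrainingArguments` configuration matching the values above; the output directory is an assumption, and the dataset/model wiring is omitted:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="parallel-gpt2-medium-wikitext",  # assumed name, not from the card
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=2,  # effective train batch size: 16 * 2 = 32
    optim="adamw_torch",            # AdamW, betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=5,
)
```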
### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy | Perplexity | Bleu |
|:-------------:|:------:|:-----:|:---------------:|:--------:|:----------:|:------:|
| 6.4455 | 0.1404 | 500 | 6.3313 | 0.1766 | 561.8647 | 0.0257 |
| 5.7254 | 0.2807 | 1000 | 5.6235 | 0.2136 | 276.8543 | 0.0454 |
| 5.1084 | 0.4211 | 1500 | 4.9822 | 0.2576 | 145.7898 | 0.0649 |
| 4.5994 | 0.5614 | 2000 | 4.5052 | 0.2929 | 90.4901 | 0.0741 |
| 4.2338 | 0.7018 | 2500 | 4.1378 | 0.3273 | 62.6674 | 0.0937 |
| 3.9975 | 0.8421 | 3000 | 3.9286 | 0.3465 | 50.8364 | 0.1031 |
| 3.8648 | 0.9825 | 3500 | 3.7926 | 0.3583 | 44.3697 | 0.1166 |
| 3.7164 | 1.1227 | 4000 | 3.6987 | 0.3667 | 40.3929 | 0.1226 |
| 3.6639 | 1.2630 | 4500 | 3.6221 | 0.3734 | 37.4157 | 0.1282 |
| 3.5820 | 1.4034 | 5000 | 3.5575 | 0.3796 | 35.0763 | 0.1277 |
| 3.5315 | 1.5437 | 5500 | 3.5064 | 0.3840 | 33.3276 | 0.1312 |
| 3.5025 | 1.6841 | 6000 | 3.4594 | 0.3881 | 31.7989 | 0.1366 |
| 3.4462 | 1.8244 | 6500 | 3.4208 | 0.3919 | 30.5952 | 0.1310 |
| 3.4167 | 1.9648 | 7000 | 3.3863 | 0.3956 | 29.5564 | 0.1355 |
| 3.2967 | 2.1050 | 7500 | 3.3548 | 0.3989 | 28.6395 | 0.1317 |
| 3.2909 | 2.2453 | 8000 | 3.3290 | 0.4015 | 27.9115 | 0.1381 |
| 3.2593 | 2.3857 | 8500 | 3.3044 | 0.4039 | 27.2323 | 0.1422 |
| 3.2408 | 2.5260 | 9000 | 3.2826 | 0.4061 | 26.6448 | 0.1412 |
| 3.2278 | 2.6664 | 9500 | 3.2592 | 0.4090 | 26.0285 | 0.1436 |
| 3.2172 | 2.8067 | 10000 | 3.2415 | 0.4105 | 25.5733 | 0.1412 |
| 3.2145 | 2.9471 | 10500 | 3.2227 | 0.4125 | 25.0946 | 0.1402 |
| 3.0749 | 3.0873 | 11000 | 3.2099 | 0.4143 | 24.7768 | 0.1413 |
| 3.0777 | 3.2276 | 11500 | 3.1978 | 0.4160 | 24.4784 | 0.1420 |
| 3.0743 | 3.3680 | 12000 | 3.1855 | 0.4174 | 24.1797 | 0.1438 |
| 3.0679 | 3.5084 | 12500 | 3.1735 | 0.4183 | 23.8912 | 0.1397 |
| 3.0635 | 3.6487 | 13000 | 3.1599 | 0.4200 | 23.5691 | 0.1423 |
| 3.0262 | 3.7891 | 13500 | 3.1489 | 0.4211 | 23.3095 | 0.1432 |
| 3.0382 | 3.9294 | 14000 | 3.1397 | 0.4223 | 23.0970 | 0.1461 |
| 2.9525 | 4.0696 | 14500 | 3.1335 | 0.4233 | 22.9539 | 0.1457 |
| 2.9621 | 4.2100 | 15000 | 3.1270 | 0.4239 | 22.8057 | 0.1454 |
| 2.9422 | 4.3503 | 15500 | 3.1211 | 0.4250 | 22.6718 | 0.1468 |
| 2.9224 | 4.4907 | 16000 | 3.1149 | 0.4257 | 22.5322 | 0.1454 |
| 2.9475 | 4.6310 | 16500 | 3.1084 | 0.4264 | 22.3862 | 0.1497 |
| 2.9318 | 4.7714 | 17000 | 3.1041 | 0.4270 | 22.2899 | 0.1468 |
| 2.9268 | 4.9117 | 17500 | 3.1010 | 0.4274 | 22.2205 | 0.1461 |

### Framework versions

- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
generation_config.json ADDED
@@ -0,0 +1,7 @@
{
  "_from_model_config": true,
  "bos_token_id": 50256,
  "eos_token_id": 50256,
  "transformers_version": "4.49.0",
  "use_cache": false
}
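
Note: token id 50256 is GPT-2's `<|endoftext|>` token, used here as both BOS and EOS. As a minimal sketch (hub id inferred from this repository), the config can be inspected with the `transformers` API:

```python
from transformers import GenerationConfig

# Hub id inferred from this repo; adjust if the checkpoint lives elsewhere.
gen_config = GenerationConfig.from_pretrained("shivanandmn/parallel-gpt2-medium-wikitext")
print(gen_config.eos_token_id)  # 50256 (GPT-2's <|endoftext|>)
```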