alakxender committed
Commit 7089ca3 · verified · 1 Parent(s): 5636d13

Training in progress, step 100

README.md ADDED
@@ -0,0 +1,139 @@
---
library_name: transformers
license: cc-by-4.0
base_model: facebook/nougat-small
tags:
- generated_from_trainer
model-index:
- name: dhivehi-nougat-small-dv01-01
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# dhivehi-nougat-small-dv01-01

This model is a fine-tuned version of [facebook/nougat-small](https://huggingface.co/facebook/nougat-small) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0300

## Model description

More information needed

## Intended uses & limitations

More information needed
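
A minimal inference sketch, assuming the standard Nougat loading path (`NougatProcessor` + `VisionEncoderDecoderModel`); the repository id and input path below are assumptions inferred from the model name and committer, not taken from this commit:

```python
# Sketch only: standard Nougat-style OCR inference with this checkpoint.
from PIL import Image
from transformers import NougatProcessor, VisionEncoderDecoderModel

repo_id = "alakxender/dhivehi-nougat-small-dv01-01"  # assumed repo id
processor = NougatProcessor.from_pretrained(repo_id)
model = VisionEncoderDecoderModel.from_pretrained(repo_id)

image = Image.open("page.png").convert("RGB")  # placeholder scanned page
pixel_values = processor(images=image, return_tensors="pt").pixel_values

# Greedy decoding; bos/eos/pad token ids come from the checkpoint's generation config.
output_ids = model.generate(pixel_values, max_new_tokens=1024)
text = processor.batch_decode(output_ids, skip_special_tokens=True)[0]
print(text)
```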

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 3
- eval_batch_size: 3
- seed: 42
- gradient_accumulation_steps: 6
- total_train_batch_size: 18
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 100
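
For reference, a sketch of how these values would map onto `Seq2SeqTrainingArguments`; the output directory and the single-device setup (which makes 3 × 6 = 18 the effective batch size) are assumptions:

```python
# Sketch only: the listed hyperparameters expressed as Trainer arguments.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="dhivehi-nougat-small-dv01-01",  # assumed output path
    learning_rate=1e-4,
    per_device_train_batch_size=3,
    per_device_eval_batch_size=3,
    gradient_accumulation_steps=6,  # 3 x 6 = 18 total train batch size (single device assumed)
    num_train_epochs=100,
    lr_scheduler_type="linear",
    optim="adamw_torch",            # AdamW defaults: betas=(0.9, 0.999), eps=1e-08
    seed=42,
)
```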

### Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 7.1462 | 0.0567 | 100 | 1.1326 |
| 6.5572 | 0.1135 | 200 | 1.0543 |
| 6.1831 | 0.1702 | 300 | 0.9868 |
| 6.0022 | 0.2269 | 400 | 0.9323 |
| 5.6527 | 0.2837 | 500 | 0.8896 |
| 5.5004 | 0.3404 | 600 | 0.8478 |
| 5.2741 | 0.3971 | 700 | 0.8168 |
| 4.9927 | 0.4539 | 800 | 0.7466 |
| 4.3776 | 0.5106 | 900 | 0.6724 |
| 2.816 | 0.5673 | 1000 | 0.4038 |
| 1.8526 | 0.6241 | 1100 | 0.2720 |
| 1.5099 | 0.6808 | 1200 | 0.2064 |
| 1.3084 | 0.7375 | 1300 | 0.1696 |
| 1.1449 | 0.7943 | 1400 | 0.1516 |
| 0.8819 | 0.8510 | 1500 | 0.1331 |
| 0.7947 | 0.9077 | 1600 | 0.1194 |
| 0.9857 | 0.9644 | 1700 | 0.1091 |
| 0.7097 | 1.0210 | 1800 | 0.1023 |
| 0.5212 | 1.0777 | 1900 | 0.0953 |
| 0.6396 | 1.1345 | 2000 | 0.0882 |
| 0.6073 | 1.1912 | 2100 | 0.0863 |
| 0.5683 | 1.2479 | 2200 | 0.0815 |
| 0.5399 | 1.3047 | 2300 | 0.0770 |
| 0.5433 | 1.3614 | 2400 | 0.0740 |
| 0.5824 | 1.4181 | 2500 | 0.0688 |
| 0.447 | 1.4748 | 2600 | 0.0665 |
| 0.4875 | 1.5316 | 2700 | 0.0633 |
| 0.4694 | 1.5883 | 2800 | 0.0616 |
| 0.4001 | 1.6450 | 2900 | 0.0580 |
| 0.3971 | 1.7018 | 3000 | 0.0585 |
| 0.3889 | 1.7585 | 3100 | 0.0556 |
| 0.3088 | 1.8152 | 3200 | 0.0546 |
| 0.3476 | 1.8720 | 3300 | 0.0522 |
| 0.4569 | 1.9287 | 3400 | 0.0513 |
| 0.3979 | 1.9854 | 3500 | 0.0502 |
| 0.2847 | 2.0420 | 3600 | 0.0486 |
| 0.4332 | 2.0987 | 3700 | 0.0465 |
| 0.3647 | 2.1554 | 3800 | 0.0469 |
| 0.3791 | 2.2122 | 3900 | 0.0459 |
| 0.2982 | 2.2689 | 4000 | 0.0450 |
| 0.3294 | 2.3256 | 4100 | 0.0447 |
| 0.2839 | 2.3824 | 4200 | 0.0434 |
| 0.3094 | 2.4391 | 4300 | 0.0433 |
| 0.3062 | 2.4958 | 4400 | 0.0422 |
| 0.2723 | 2.5526 | 4500 | 0.0412 |
| 0.2348 | 2.6093 | 4600 | 0.0406 |
| 0.2125 | 2.6660 | 4700 | 0.0403 |
| 0.3172 | 2.7228 | 4800 | 0.0385 |
| 0.2315 | 2.7795 | 4900 | 0.0382 |
| 0.2707 | 2.8362 | 5000 | 0.0385 |
| 0.2391 | 2.8930 | 5100 | 0.0373 |
| 0.2979 | 2.9497 | 5200 | 0.0372 |
| 0.2933 | 3.0062 | 5300 | 0.0362 |
| 0.2388 | 3.0630 | 5400 | 0.0357 |
| 0.2525 | 3.1197 | 5500 | 0.0364 |
| 0.2563 | 3.1764 | 5600 | 0.0359 |
| 0.2534 | 3.2332 | 5700 | 0.0354 |
| 0.2401 | 3.2899 | 5800 | 0.0344 |
| 0.2116 | 3.3466 | 5900 | 0.0340 |
| 0.2713 | 3.4034 | 6000 | 0.0340 |
| 0.2351 | 3.4601 | 6100 | 0.0333 |
| 0.1471 | 3.5168 | 6200 | 0.0335 |
| 0.2209 | 3.5736 | 6300 | 0.0326 |
| 0.2206 | 3.6303 | 6400 | 0.0324 |
| 0.2208 | 3.6870 | 6500 | 0.0316 |
| 0.2329 | 3.7438 | 6600 | 0.0316 |
| 0.1439 | 3.8005 | 6700 | 0.0312 |
| 0.2335 | 3.8572 | 6800 | 0.0315 |
| 0.1582 | 3.9140 | 6900 | 0.0312 |
| 0.2298 | 3.9707 | 7000 | 0.0305 |
| 0.1649 | 4.0272 | 7100 | 0.0309 |
| 0.1489 | 4.0840 | 7200 | 0.0304 |
| 0.1729 | 4.1407 | 7300 | 0.0304 |
| 0.1907 | 4.1974 | 7400 | 0.0297 |
| 0.2 | 4.2542 | 7500 | 0.0298 |
| 0.1776 | 4.3109 | 7600 | 0.0296 |
| 0.1955 | 4.3676 | 7700 | 0.0292 |
| 0.1838 | 4.4244 | 7800 | 0.0295 |
| 0.1685 | 4.4811 | 7900 | 0.0292 |
| 0.161 | 4.5378 | 8000 | 0.0300 |

### Framework versions

- Transformers 4.47.0
- Pytorch 2.6.0+cu124
- Datasets 3.2.0
- Tokenizers 0.21.0
config.json ADDED
@@ -0,0 +1,193 @@
{
  "_name_or_path": "facebook/nougat-base",
  "architectures": [
    "VisionEncoderDecoderModel"
  ],
  "decoder": {
    "_attn_implementation_autoset": false,
    "_name_or_path": "",
    "activation_dropout": 0.0,
    "activation_function": "gelu",
    "add_cross_attention": true,
    "add_final_layer_norm": true,
    "architectures": null,
    "attention_dropout": 0.0,
    "bad_words_ids": null,
    "begin_suppress_tokens": null,
    "bos_token_id": 0,
    "chunk_size_feed_forward": 0,
    "classifier_dropout": 0.0,
    "cross_attention_hidden_size": null,
    "d_model": 1024,
    "decoder_attention_heads": 16,
    "decoder_ffn_dim": 4096,
    "decoder_layerdrop": 0.0,
    "decoder_layers": 10,
    "decoder_start_token_id": null,
    "diversity_penalty": 0.0,
    "do_sample": false,
    "dropout": 0.1,
    "early_stopping": false,
    "encoder_attention_heads": 16,
    "encoder_ffn_dim": 4096,
    "encoder_layerdrop": 0.0,
    "encoder_layers": 12,
    "encoder_no_repeat_ngram_size": 0,
    "eos_token_id": 2,
    "exponential_decay_length_penalty": null,
    "finetuning_task": null,
    "forced_bos_token_id": null,
    "forced_eos_token_id": 2,
    "id2label": {
      "0": "LABEL_0",
      "1": "LABEL_1"
    },
    "init_std": 0.02,
    "is_decoder": true,
    "is_encoder_decoder": false,
    "label2id": {
      "LABEL_0": 0,
      "LABEL_1": 1
    },
    "length_penalty": 1.0,
    "max_length": 20,
    "max_position_embeddings": 4096,
    "min_length": 0,
    "model_type": "mbart",
    "no_repeat_ngram_size": 0,
    "num_beam_groups": 1,
    "num_beams": 1,
    "num_hidden_layers": 12,
    "num_return_sequences": 1,
    "output_attentions": false,
    "output_hidden_states": false,
    "output_scores": false,
    "pad_token_id": 1,
    "prefix": null,
    "problem_type": null,
    "pruned_heads": {},
    "remove_invalid_values": false,
    "repetition_penalty": 1.0,
    "return_dict": true,
    "return_dict_in_generate": false,
    "scale_embedding": true,
    "sep_token_id": null,
    "suppress_tokens": null,
    "task_specific_params": null,
    "temperature": 1.0,
    "tf_legacy_loss": false,
    "tie_encoder_decoder": false,
    "tie_word_embeddings": false,
    "tokenizer_class": null,
    "top_k": 50,
    "top_p": 1.0,
    "torch_dtype": null,
    "torchscript": false,
    "typical_p": 1.0,
    "use_bfloat16": false,
    "use_cache": true,
    "vocab_size": 50000
  },
  "decoder_start_token_id": 0,
  "encoder": {
    "_attn_implementation_autoset": false,
    "_name_or_path": "",
    "add_cross_attention": false,
    "architectures": null,
    "attention_probs_dropout_prob": 0.0,
    "bad_words_ids": null,
    "begin_suppress_tokens": null,
    "bos_token_id": null,
    "chunk_size_feed_forward": 0,
    "cross_attention_hidden_size": null,
    "decoder_start_token_id": null,
    "depths": [
      2,
      2,
      14,
      2
    ],
    "diversity_penalty": 0.0,
    "do_sample": false,
    "drop_path_rate": 0.1,
    "early_stopping": false,
    "embed_dim": 128,
    "encoder_no_repeat_ngram_size": 0,
    "eos_token_id": null,
    "exponential_decay_length_penalty": null,
    "finetuning_task": null,
    "forced_bos_token_id": null,
    "forced_eos_token_id": null,
    "hidden_act": "gelu",
    "hidden_dropout_prob": 0.0,
    "hidden_size": 1024,
    "id2label": {
      "0": "LABEL_0",
      "1": "LABEL_1"
    },
    "image_size": [
      896,
      672
    ],
    "initializer_range": 0.02,
    "is_decoder": false,
    "is_encoder_decoder": false,
    "label2id": {
      "LABEL_0": 0,
      "LABEL_1": 1
    },
    "layer_norm_eps": 1e-05,
    "length_penalty": 1.0,
    "max_length": 20,
    "min_length": 0,
    "mlp_ratio": 4.0,
    "model_type": "donut-swin",
    "no_repeat_ngram_size": 0,
    "num_beam_groups": 1,
    "num_beams": 1,
    "num_channels": 3,
    "num_heads": [
      4,
      8,
      16,
      32
    ],
    "num_layers": 4,
    "num_return_sequences": 1,
    "output_attentions": false,
    "output_hidden_states": false,
    "output_scores": false,
    "pad_token_id": null,
    "patch_size": 4,
    "prefix": null,
    "problem_type": null,
    "pruned_heads": {},
    "qkv_bias": true,
    "remove_invalid_values": false,
    "repetition_penalty": 1.0,
    "return_dict": true,
    "return_dict_in_generate": false,
    "sep_token_id": null,
    "suppress_tokens": null,
    "task_specific_params": null,
    "temperature": 1.0,
    "tf_legacy_loss": false,
    "tie_encoder_decoder": false,
    "tie_word_embeddings": true,
    "tokenizer_class": null,
    "top_k": 50,
    "top_p": 1.0,
    "torch_dtype": null,
    "torchscript": false,
    "typical_p": 1.0,
    "use_absolute_embeddings": false,
    "use_bfloat16": false,
    "window_size": 7
  },
  "is_encoder_decoder": true,
  "model_type": "vision-encoder-decoder",
  "pad_token_id": 1,
  "tie_word_embeddings": false,
  "torch_dtype": "bfloat16",
  "transformers_version": "4.47.0"
}
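
The configuration pairs a Donut-Swin encoder (896×672 input) with a 10-layer mBART decoder (d_model 1024, vocab size 50000). A small inspection sketch, assuming the same repository id as above:

```python
# Sketch only: inspect the composite encoder-decoder configuration.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("alakxender/dhivehi-nougat-small-dv01-01")  # assumed repo id
print(config.model_type)                                      # vision-encoder-decoder
print(config.encoder.model_type, config.encoder.image_size)   # donut-swin [896, 672]
print(config.decoder.model_type, config.decoder.decoder_layers, config.decoder.vocab_size)  # mbart 10 50000
```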
generation_config.json ADDED
@@ -0,0 +1,8 @@
{
  "_from_model_config": true,
  "bos_token_id": 0,
  "eos_token_id": 2,
  "forced_eos_token_id": 2,
  "pad_token_id": 1,
  "transformers_version": "4.47.0"
}
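
These defaults are picked up automatically by `generate()`; a minimal check, assuming the same repository id:

```python
# Sketch only: the generation defaults bundled with the checkpoint.
from transformers import GenerationConfig

gen_config = GenerationConfig.from_pretrained("alakxender/dhivehi-nougat-small-dv01-01")  # assumed repo id
print(gen_config.bos_token_id, gen_config.eos_token_id, gen_config.pad_token_id)  # 0 2 1
```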
model.safetensors ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:a3849a2405bfbb42310c026c2bbd98c02f4ff6da4ee06ccd72cbfd418d18c6ab
size 697843112
preprocessor_config.json ADDED
@@ -0,0 +1,27 @@
{
  "do_align_long_axis": false,
  "do_crop_margin": true,
  "do_normalize": true,
  "do_pad": true,
  "do_rescale": true,
  "do_resize": true,
  "do_thumbnail": true,
  "image_mean": [
    0.485,
    0.456,
    0.406
  ],
  "image_processor_type": "NougatImageProcessor",
  "image_std": [
    0.229,
    0.224,
    0.225
  ],
  "processor_class": "NougatProcessor",
  "resample": 2,
  "rescale_factor": 0.00392156862745098,
  "size": {
    "height": 896,
    "width": 672
  }
}
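
These settings describe the standard Nougat preprocessing: crop margins, resize/thumbnail and pad to 896×672, rescale by 1/255, and normalize with ImageNet statistics. A quick sanity check, assuming the same repository id:

```python
# Sketch only: confirm the image processor output matches the configured 896x672 size.
from PIL import Image
from transformers import NougatImageProcessor

image_processor = NougatImageProcessor.from_pretrained(
    "alakxender/dhivehi-nougat-small-dv01-01"  # assumed repo id
)
image = Image.open("page.png").convert("RGB")  # placeholder scanned page
batch = image_processor(images=image, return_tensors="pt")
print(batch.pixel_values.shape)  # torch.Size([1, 3, 896, 672])
```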
training_args.bin ADDED
@@ -0,0 +1,3 @@
version https://git-lfs.github.com/spec/v1
oid sha256:d55d96f2b7de51106c8daeac9e4f76567fa44336c14db287d5015285eb420b78
size 5624