cdong committed
Commit 1a19bcf · 1 Parent(s): defd09e

End of training
README.md ADDED
@@ -0,0 +1,159 @@
+ ---
+ license: mit
+ base_model: camembert-base
+ tags:
+ - generated_from_trainer
+ metrics:
+ - accuracy
+ model-index:
+ - name: my_awesome_model
+   results: []
+ ---
+
+ <!-- This model card has been generated automatically according to the information the Trainer had access to. You
+ should probably proofread and complete it, then remove this comment. -->
+
+ # my_awesome_model
+
+ This model is a fine-tuned version of [camembert-base](https://huggingface.co/camembert-base) on an unspecified dataset.
+ It achieves the following results on the evaluation set:
+ - Loss: 0.1882
+ - Accuracy: 1.0
+
+ ## Model description
+
+ More information needed
+
+ ## Intended uses & limitations
+
+ More information needed
+
+ ## Training and evaluation data
+
+ More information needed
+
+ ## Training procedure
+
+ ### Training hyperparameters
+
+ The following hyperparameters were used during training:
+ - learning_rate: 2e-05
+ - train_batch_size: 16
+ - eval_batch_size: 16
+ - seed: 42
+ - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
+ - lr_scheduler_type: linear
+ - num_epochs: 100
+
+ ### Training results
+
+ | Training Loss | Epoch | Step | Validation Loss | Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:--------:|
+ | No log | 1.0 | 1 | 0.6856 | 0.5 |
+ | No log | 2.0 | 2 | 0.6825 | 0.5 |
+ | No log | 3.0 | 3 | 0.6796 | 0.5 |
+ | No log | 4.0 | 4 | 0.6775 | 0.5 |
+ | No log | 5.0 | 5 | 0.6750 | 0.5 |
+ | No log | 6.0 | 6 | 0.6718 | 0.5 |
+ | No log | 7.0 | 7 | 0.6680 | 0.5 |
+ | No log | 8.0 | 8 | 0.6613 | 0.5 |
+ | No log | 9.0 | 9 | 0.6675 | 0.5 |
+ | No log | 10.0 | 10 | 0.6638 | 0.5 |
+ | No log | 11.0 | 11 | 0.6603 | 0.5 |
+ | No log | 12.0 | 12 | 0.6568 | 0.5 |
+ | No log | 13.0 | 13 | 0.6528 | 0.5 |
+ | No log | 14.0 | 14 | 0.6459 | 0.5 |
+ | No log | 15.0 | 15 | 0.6389 | 0.5 |
+ | No log | 16.0 | 16 | 0.6246 | 0.5 |
+ | No log | 17.0 | 17 | 0.6152 | 0.5 |
+ | No log | 18.0 | 18 | 0.6050 | 0.5 |
+ | No log | 19.0 | 19 | 0.5939 | 0.5 |
+ | No log | 20.0 | 20 | 0.5820 | 0.5 |
+ | No log | 21.0 | 21 | 0.5707 | 0.5 |
+ | No log | 22.0 | 22 | 0.5604 | 0.5 |
+ | No log | 23.0 | 23 | 0.5504 | 0.5 |
+ | No log | 24.0 | 24 | 0.5376 | 0.5 |
+ | No log | 25.0 | 25 | 0.5233 | 1.0 |
+ | No log | 26.0 | 26 | 0.5108 | 1.0 |
+ | No log | 27.0 | 27 | 0.4983 | 1.0 |
+ | No log | 28.0 | 28 | 0.4864 | 1.0 |
+ | No log | 29.0 | 29 | 0.4744 | 1.0 |
+ | No log | 30.0 | 30 | 0.4632 | 1.0 |
+ | No log | 31.0 | 31 | 0.4523 | 1.0 |
+ | No log | 32.0 | 32 | 0.4423 | 1.0 |
+ | No log | 33.0 | 33 | 0.4331 | 1.0 |
+ | No log | 34.0 | 34 | 0.4246 | 1.0 |
+ | No log | 35.0 | 35 | 0.4168 | 1.0 |
+ | No log | 36.0 | 36 | 0.4089 | 1.0 |
+ | No log | 37.0 | 37 | 0.4007 | 1.0 |
+ | No log | 38.0 | 38 | 0.3936 | 1.0 |
+ | No log | 39.0 | 39 | 0.3873 | 1.0 |
+ | No log | 40.0 | 40 | 0.3795 | 1.0 |
+ | No log | 41.0 | 41 | 0.3698 | 1.0 |
+ | No log | 42.0 | 42 | 0.3599 | 1.0 |
+ | No log | 43.0 | 43 | 0.3509 | 1.0 |
+ | No log | 44.0 | 44 | 0.3430 | 1.0 |
+ | No log | 45.0 | 45 | 0.3359 | 1.0 |
+ | No log | 46.0 | 46 | 0.3289 | 1.0 |
+ | No log | 47.0 | 47 | 0.3204 | 1.0 |
+ | No log | 48.0 | 48 | 0.3130 | 1.0 |
+ | No log | 49.0 | 49 | 0.3065 | 1.0 |
+ | No log | 50.0 | 50 | 0.2998 | 1.0 |
+ | No log | 51.0 | 51 | 0.2943 | 1.0 |
+ | No log | 52.0 | 52 | 0.2889 | 1.0 |
+ | No log | 53.0 | 53 | 0.2832 | 1.0 |
+ | No log | 54.0 | 54 | 0.2783 | 1.0 |
+ | No log | 55.0 | 55 | 0.2733 | 1.0 |
+ | No log | 56.0 | 56 | 0.2693 | 1.0 |
+ | No log | 57.0 | 57 | 0.2658 | 1.0 |
+ | No log | 58.0 | 58 | 0.2625 | 1.0 |
+ | No log | 59.0 | 59 | 0.2591 | 1.0 |
+ | No log | 60.0 | 60 | 0.2562 | 1.0 |
+ | No log | 61.0 | 61 | 0.2531 | 1.0 |
+ | No log | 62.0 | 62 | 0.2497 | 1.0 |
+ | No log | 63.0 | 63 | 0.2460 | 1.0 |
+ | No log | 64.0 | 64 | 0.2424 | 1.0 |
+ | No log | 65.0 | 65 | 0.2389 | 1.0 |
+ | No log | 66.0 | 66 | 0.2356 | 1.0 |
+ | No log | 67.0 | 67 | 0.2329 | 1.0 |
+ | No log | 68.0 | 68 | 0.2300 | 1.0 |
+ | No log | 69.0 | 69 | 0.2269 | 1.0 |
+ | No log | 70.0 | 70 | 0.2243 | 1.0 |
+ | No log | 71.0 | 71 | 0.2212 | 1.0 |
+ | No log | 72.0 | 72 | 0.2186 | 1.0 |
+ | No log | 73.0 | 73 | 0.2158 | 1.0 |
+ | No log | 74.0 | 74 | 0.2129 | 1.0 |
+ | No log | 75.0 | 75 | 0.2104 | 1.0 |
+ | No log | 76.0 | 76 | 0.2082 | 1.0 |
+ | No log | 77.0 | 77 | 0.2061 | 1.0 |
+ | No log | 78.0 | 78 | 0.2043 | 1.0 |
+ | No log | 79.0 | 79 | 0.2029 | 1.0 |
+ | No log | 80.0 | 80 | 0.2017 | 1.0 |
+ | No log | 81.0 | 81 | 0.2005 | 1.0 |
+ | No log | 82.0 | 82 | 0.1994 | 1.0 |
+ | No log | 83.0 | 83 | 0.1981 | 1.0 |
+ | No log | 84.0 | 84 | 0.1969 | 1.0 |
+ | No log | 85.0 | 85 | 0.1959 | 1.0 |
+ | No log | 86.0 | 86 | 0.1951 | 1.0 |
+ | No log | 87.0 | 87 | 0.1943 | 1.0 |
+ | No log | 88.0 | 88 | 0.1935 | 1.0 |
+ | No log | 89.0 | 89 | 0.1928 | 1.0 |
+ | No log | 90.0 | 90 | 0.1920 | 1.0 |
+ | No log | 91.0 | 91 | 0.1913 | 1.0 |
+ | No log | 92.0 | 92 | 0.1908 | 1.0 |
+ | No log | 93.0 | 93 | 0.1904 | 1.0 |
+ | No log | 94.0 | 94 | 0.1900 | 1.0 |
+ | No log | 95.0 | 95 | 0.1894 | 1.0 |
+ | No log | 96.0 | 96 | 0.1890 | 1.0 |
+ | No log | 97.0 | 97 | 0.1887 | 1.0 |
+ | No log | 98.0 | 98 | 0.1884 | 1.0 |
+ | No log | 99.0 | 99 | 0.1883 | 1.0 |
+ | No log | 100.0 | 100 | 0.1882 | 1.0 |
+
+
+ ### Framework versions
+
+ - Transformers 4.32.1
+ - Pytorch 2.1.1
+ - Datasets 2.14.7
+ - Tokenizers 0.13.2
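
For reference, the hyperparameters listed in the card above correspond roughly to the following `TrainingArguments`/`Trainer` setup. This is a minimal sketch, not the original training script: the two-sentence dataset is a placeholder, since the actual training data is not part of this commit.

```python
# Sketch of the training setup described in the model card above.
# The toy two-example dataset is a stand-in for the (unknown) real data.
import numpy as np
import evaluate
from datasets import Dataset
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    Trainer,
    TrainingArguments,
)

tokenizer = AutoTokenizer.from_pretrained("camembert-base")
model = AutoModelForSequenceClassification.from_pretrained("camembert-base", num_labels=2)

# Placeholder data: two French sentences with NEGATIVE (0) / POSITIVE (1) labels.
raw = Dataset.from_dict({
    "text": ["Ce film était vraiment mauvais.", "J'ai adoré ce film."],
    "label": [0, 1],
})
encoded = raw.map(lambda batch: tokenizer(batch["text"], truncation=True), batched=True)

args = TrainingArguments(
    output_dir="my_awesome_model",
    learning_rate=2e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    num_train_epochs=100,
    lr_scheduler_type="linear",
    seed=42,
    evaluation_strategy="epoch",
)

accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    return accuracy.compute(predictions=np.argmax(logits, axis=-1), references=labels)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=encoded,
    eval_dataset=encoded,  # placeholder: reusing the toy split for evaluation
    tokenizer=tokenizer,
    compute_metrics=compute_metrics,
)
trainer.train()
```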
config.json ADDED
@@ -0,0 +1,37 @@
+ {
+   "_name_or_path": "camembert-base",
+   "architectures": [
+     "CamembertForSequenceClassification"
+   ],
+   "attention_probs_dropout_prob": 0.1,
+   "bos_token_id": 5,
+   "classifier_dropout": null,
+   "eos_token_id": 6,
+   "hidden_act": "gelu",
+   "hidden_dropout_prob": 0.1,
+   "hidden_size": 768,
+   "id2label": {
+     "0": "NEGATIVE",
+     "1": "POSITIVE"
+   },
+   "initializer_range": 0.02,
+   "intermediate_size": 3072,
+   "label2id": {
+     "NEGATIVE": 0,
+     "POSITIVE": 1
+   },
+   "layer_norm_eps": 1e-05,
+   "max_position_embeddings": 514,
+   "model_type": "camembert",
+   "num_attention_heads": 12,
+   "num_hidden_layers": 12,
+   "output_past": true,
+   "pad_token_id": 1,
+   "position_embedding_type": "absolute",
+   "problem_type": "single_label_classification",
+   "torch_dtype": "float32",
+   "transformers_version": "4.32.1",
+   "type_vocab_size": 1,
+   "use_cache": true,
+   "vocab_size": 32005
+ }
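
The `id2label`/`label2id` entries above are what turn the classifier's two logits into the NEGATIVE/POSITIVE names at inference time. A minimal sketch, assuming a local copy of this repository's files at `./my_awesome_model`:

```python
# Minimal inference sketch using the label mapping from config.json.
# "./my_awesome_model" is an assumed local path to this repository's files.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_path = "./my_awesome_model"
tokenizer = AutoTokenizer.from_pretrained(model_path)
model = AutoModelForSequenceClassification.from_pretrained(model_path)
model.eval()

inputs = tokenizer("J'ai adoré ce film.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

probs = torch.softmax(logits, dim=-1)[0]
pred_id = int(probs.argmax())
# id2label comes straight from config.json: {0: "NEGATIVE", 1: "POSITIVE"}
print(model.config.id2label[pred_id], float(probs[pred_id]))
```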
pytorch_model.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:0a4cdf4cdc57d85a1408fe7c215b399cc0ef37d843c404f783cb5547184ccdc1
+ size 442560558
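
Note that `pytorch_model.bin` (and the other binary files below) is committed as a Git LFS pointer: the three lines above record only the SHA-256 and byte size of the ~442 MB weights file, not the weights themselves. A sketch of how such a pointer can be checked against a downloaded copy; the file paths are assumptions:

```python
# Sketch: verify a downloaded weights file against its Git LFS pointer.
import hashlib
import os

def parse_lfs_pointer(pointer_path):
    """Return the SHA-256 hex digest and byte size stored in a Git LFS pointer file."""
    fields = {}
    with open(pointer_path) as f:
        for line in f:
            key, _, value = line.strip().partition(" ")
            fields[key] = value
    return fields["oid"].split(":", 1)[1], int(fields["size"])

def verify(real_file, pointer_path):
    expected_oid, expected_size = parse_lfs_pointer(pointer_path)
    digest = hashlib.sha256()
    with open(real_file, "rb") as f:
        for chunk in iter(lambda: f.read(1 << 20), b""):
            digest.update(chunk)
    return os.path.getsize(real_file) == expected_size and digest.hexdigest() == expected_oid

# Assumed paths: the resolved weights file and the pointer checked into the repo.
print(verify("pytorch_model.bin", "pytorch_model.bin.pointer"))
```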
sentencepiece.bpe.model ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:988bc5a00281c6d210a5d34bd143d0363741a432fefe741bf71e61b1869d4314
+ size 810912
special_tokens_map.json ADDED
@@ -0,0 +1,19 @@
+ {
+   "additional_special_tokens": [
+     "<s>NOTUSED",
+     "</s>NOTUSED"
+   ],
+   "bos_token": "<s>",
+   "cls_token": "<s>",
+   "eos_token": "</s>",
+   "mask_token": {
+     "content": "<mask>",
+     "lstrip": true,
+     "normalized": false,
+     "rstrip": false,
+     "single_word": false
+   },
+   "pad_token": "<pad>",
+   "sep_token": "</s>",
+   "unk_token": "<unk>"
+ }
tokenizer.json ADDED
The diff for this file is too large to render. See raw diff
 
tokenizer_config.json ADDED
@@ -0,0 +1,23 @@
+ {
+   "additional_special_tokens": [
+     "<s>NOTUSED",
+     "</s>NOTUSED"
+   ],
+   "bos_token": "<s>",
+   "clean_up_tokenization_spaces": true,
+   "cls_token": "<s>",
+   "eos_token": "</s>",
+   "mask_token": {
+     "__type": "AddedToken",
+     "content": "<mask>",
+     "lstrip": true,
+     "normalized": true,
+     "rstrip": false,
+     "single_word": false
+   },
+   "model_max_length": 512,
+   "pad_token": "<pad>",
+   "sep_token": "</s>",
+   "tokenizer_class": "CamembertTokenizer",
+   "unk_token": "<unk>"
+ }
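
Together with `special_tokens_map.json`, this config determines how `CamembertTokenizer` wraps and truncates input: sequences are enclosed in `<s>` ... `</s>` and capped at a `model_max_length` of 512 tokens. A quick sketch; loading `camembert-base` from the Hub is an assumption, and a local copy of this repository's tokenizer files would behave the same way:

```python
# Sketch: inspect the behavior defined by the committed tokenizer files.
from transformers import CamembertTokenizer

tok = CamembertTokenizer.from_pretrained("camembert-base")  # assumed equivalent files

enc = tok("J'ai adoré ce film.")
print(tok.convert_ids_to_tokens(enc["input_ids"]))  # pieces wrapped in <s> ... </s>
print(tok.cls_token, tok.sep_token, tok.pad_token)  # <s> </s> <pad>, per special_tokens_map.json
print(tok.model_max_length)                         # 512, per tokenizer_config.json

# Inputs longer than model_max_length are cut down when truncation=True.
long_enc = tok("mot " * 1000, truncation=True)
print(len(long_enc["input_ids"]))                   # <= 512
```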
training_args.bin ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:340cde068031bb96fd3dcac61ea98bf75ec51b5c54cd3029555eda2118348a80
+ size 4472
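
`training_args.bin` is the serialized `TrainingArguments` object the `Trainer` saved alongside the model; reloading it recovers the exact settings of this run. A minimal sketch, assuming a local copy of the file and the framework versions listed in the card:

```python
# Sketch: recover the saved run settings from training_args.bin.
import torch
from transformers import TrainingArguments  # the class must be importable for unpickling

args = torch.load("training_args.bin")  # assumed local copy of this file
print(type(args).__name__)              # TrainingArguments
print(args.learning_rate, args.num_train_epochs, args.seed)
```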