End of training
README.md
CHANGED
@@ -5,7 +5,7 @@ base_model: FacebookAI/xlm-roberta-large
 tags:
 - generated_from_trainer
 datasets:
+- biobert_json
 metrics:
 - precision
 - recall
@@ -21,13 +21,13 @@ should probably proofread and complete it, then remove this comment. -->
 
 # roberta-large-ner-qlorafinetune-runs-colab
 
+This model is a fine-tuned version of [FacebookAI/xlm-roberta-large](https://huggingface.co/FacebookAI/xlm-roberta-large) on the biobert_json dataset.
 It achieves the following results on the evaluation set:
+- Loss: 0.0732
+- Precision: 0.9365
+- Recall: 0.9562
+- F1: 0.9462
+- Accuracy: 0.9815
 
 ## Model description
 
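As a sanity check, the reported F1 is consistent with the precision/recall pair: 2 · 0.9365 · 0.9562 / (0.9365 + 0.9562) ≈ 0.9462. A minimal inference sketch for the adapter follows; the repo id, label count, and sample sentence are illustrative assumptions, not values taken from this card.

```python
# Hypothetical usage sketch: load xlm-roberta-large plus the LoRA adapter
# and run token classification. Repo id, num_labels, and the sample
# sentence are assumptions, not stated in the card.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification
from peft import PeftModel

base_id = "FacebookAI/xlm-roberta-large"
adapter_id = "your-username/roberta-large-ner-qlorafinetune-runs-colab"  # hypothetical

tokenizer = AutoTokenizer.from_pretrained(base_id)
# num_labels must match the fine-tuned head; the card does not state it.
base = AutoModelForTokenClassification.from_pretrained(base_id, num_labels=9)
model = PeftModel.from_pretrained(base, adapter_id)
model.eval()

inputs = tokenizer("Aspirin reduces fever.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
pred_ids = logits.argmax(dim=-1)[0].tolist()
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
print(list(zip(tokens, pred_ids)))  # label ids; map via model.config.id2label
```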
@@ -50,106 +50,105 @@ The following hyperparameters were used during training:
 - train_batch_size: 32
 - eval_batch_size: 32
 - seed: 42
+- optimizer: Use adamw_hf with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
 - lr_scheduler_type: linear
 - training_steps: 1820
-- mixed_precision_training: Native AMP
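These settings map directly onto `transformers` `TrainingArguments`; the sketch below also adds the 4-bit quantization and LoRA adapter setup that the model name ("qlora") implies. Everything beyond the listed hyperparameters (quantization type, LoRA rank/alpha/target modules, learning rate, label count) is an assumption, not read from this card.

```python
# Sketch reconstructing the listed run configuration with transformers/peft.
# Only batch sizes, seed, optimizer, scheduler, and step count come from the
# card; BitsAndBytesConfig/LoraConfig values and learning_rate are guesses.
import torch
from transformers import (AutoModelForTokenClassification, BitsAndBytesConfig,
                          TrainingArguments)
from peft import LoraConfig, get_peft_model, prepare_model_for_kbit_training

bnb_config = BitsAndBytesConfig(      # 4-bit NF4 quantization: the usual
    load_in_4bit=True,                # QLoRA recipe (assumed, not stated)
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)
model = AutoModelForTokenClassification.from_pretrained(
    "FacebookAI/xlm-roberta-large",
    num_labels=9,                     # label count assumed
    quantization_config=bnb_config,
)
model = prepare_model_for_kbit_training(model)
model = get_peft_model(model, LoraConfig(
    task_type="TOKEN_CLS",
    r=16, lora_alpha=32,              # rank/alpha assumed
    target_modules=["query", "value"],
))

args = TrainingArguments(
    output_dir="roberta-large-ner-qlorafinetune-runs-colab",
    per_device_train_batch_size=32,   # train_batch_size: 32
    per_device_eval_batch_size=32,    # eval_batch_size: 32
    seed=42,                          # seed: 42
    optim="adamw_hf",                 # optimizer: adamw_hf, betas/eps at defaults
    lr_scheduler_type="linear",       # lr_scheduler_type: linear
    max_steps=1820,                   # training_steps: 1820
    learning_rate=2e-4,               # assumed; not in the shown hunk
)
```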
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Precision | Recall | F1 | Accuracy |
 |:-------------:|:------:|:----:|:---------------:|:---------:|:------:|:------:|:--------:|
+| 1.9454 | 0.0654 | 20 | 0.9974 | 0.3110 | 0.0618 | 0.1031 | 0.7469 |
+| 0.7603 | 0.1307 | 40 | 0.4834 | 0.5661 | 0.7392 | 0.6412 | 0.8648 |
+| 0.4087 | 0.1961 | 60 | 0.2348 | 0.8311 | 0.8242 | 0.8276 | 0.9379 |
+| 0.2905 | 0.2614 | 80 | 0.1960 | 0.8021 | 0.8687 | 0.8341 | 0.9440 |
+| 0.2262 | 0.3268 | 100 | 0.1468 | 0.8719 | 0.9001 | 0.8857 | 0.9597 |
+| 0.2196 | 0.3922 | 120 | 0.1415 | 0.8444 | 0.9139 | 0.8778 | 0.9570 |
+| 0.177 | 0.4575 | 140 | 0.1139 | 0.8889 | 0.9275 | 0.9077 | 0.9671 |
+| 0.1525 | 0.5229 | 160 | 0.1190 | 0.8847 | 0.9352 | 0.9093 | 0.9665 |
+| 0.1516 | 0.5882 | 180 | 0.1099 | 0.8887 | 0.9431 | 0.9151 | 0.9684 |
+| 0.1281 | 0.6536 | 200 | 0.0891 | 0.9181 | 0.9417 | 0.9297 | 0.9745 |
+| 0.1231 | 0.7190 | 220 | 0.0926 | 0.9200 | 0.9301 | 0.9250 | 0.9733 |
+| 0.1239 | 0.7843 | 240 | 0.0956 | 0.9089 | 0.9509 | 0.9295 | 0.9733 |
+| 0.1118 | 0.8497 | 260 | 0.0885 | 0.9135 | 0.9428 | 0.9279 | 0.9744 |
+| 0.1119 | 0.9150 | 280 | 0.1002 | 0.9024 | 0.9430 | 0.9223 | 0.9711 |
+| 0.1254 | 0.9804 | 300 | 0.0839 | 0.9209 | 0.9421 | 0.9314 | 0.9756 |
+| 0.1053 | 1.0458 | 320 | 0.0827 | 0.9216 | 0.9458 | 0.9335 | 0.9761 |
+| 0.0905 | 1.1111 | 340 | 0.1008 | 0.9057 | 0.9530 | 0.9287 | 0.9711 |
+| 0.0955 | 1.1765 | 360 | 0.0784 | 0.9240 | 0.9477 | 0.9357 | 0.9771 |
+| 0.0938 | 1.2418 | 380 | 0.0844 | 0.9288 | 0.9396 | 0.9341 | 0.9766 |
+| 0.1074 | 1.3072 | 400 | 0.0818 | 0.9249 | 0.9425 | 0.9337 | 0.9771 |
+| 0.1064 | 1.3725 | 420 | 0.0980 | 0.8976 | 0.9351 | 0.9160 | 0.9695 |
+| 0.0913 | 1.4379 | 440 | 0.0815 | 0.9247 | 0.9366 | 0.9306 | 0.9768 |
+| 0.089 | 1.5033 | 460 | 0.0789 | 0.9228 | 0.9463 | 0.9344 | 0.9757 |
+| 0.1175 | 1.5686 | 480 | 0.0873 | 0.9210 | 0.9315 | 0.9262 | 0.9729 |
+| 0.0906 | 1.6340 | 500 | 0.0926 | 0.9121 | 0.9423 | 0.9269 | 0.9736 |
+| 0.0814 | 1.6993 | 520 | 0.0873 | 0.9153 | 0.9636 | 0.9388 | 0.9768 |
+| 0.0806 | 1.7647 | 540 | 0.0757 | 0.9263 | 0.9495 | 0.9378 | 0.9789 |
+| 0.0906 | 1.8301 | 560 | 0.0749 | 0.9244 | 0.9635 | 0.9436 | 0.9795 |
+| 0.0858 | 1.8954 | 580 | 0.1098 | 0.9006 | 0.9561 | 0.9275 | 0.9691 |
+| 0.092 | 1.9608 | 600 | 0.1023 | 0.9035 | 0.9561 | 0.9291 | 0.9710 |
+| 0.0764 | 2.0261 | 620 | 0.0840 | 0.9195 | 0.9543 | 0.9366 | 0.9767 |
+| 0.0655 | 2.0915 | 640 | 0.0762 | 0.9259 | 0.9542 | 0.9398 | 0.9777 |
+| 0.0573 | 2.1569 | 660 | 0.0846 | 0.9112 | 0.9503 | 0.9303 | 0.9749 |
+| 0.077 | 2.2222 | 680 | 0.0750 | 0.9300 | 0.9576 | 0.9436 | 0.9793 |
+| 0.0712 | 2.2876 | 700 | 0.0830 | 0.9186 | 0.9575 | 0.9376 | 0.9776 |
+| 0.0592 | 2.3529 | 720 | 0.0743 | 0.9338 | 0.9569 | 0.9452 | 0.9802 |
+| 0.0638 | 2.4183 | 740 | 0.0725 | 0.9349 | 0.9469 | 0.9408 | 0.9789 |
+| 0.0893 | 2.4837 | 760 | 0.0724 | 0.9295 | 0.9597 | 0.9443 | 0.9801 |
+| 0.0672 | 2.5490 | 780 | 0.0729 | 0.9389 | 0.9616 | 0.9501 | 0.9818 |
+| 0.0692 | 2.6144 | 800 | 0.0724 | 0.9427 | 0.9531 | 0.9479 | 0.9810 |
+| 0.0667 | 2.6797 | 820 | 0.0757 | 0.9418 | 0.9531 | 0.9474 | 0.9802 |
+| 0.071 | 2.7451 | 840 | 0.0777 | 0.9249 | 0.9577 | 0.9410 | 0.9791 |
+| 0.0686 | 2.8105 | 860 | 0.0721 | 0.9393 | 0.9606 | 0.9498 | 0.9819 |
+| 0.0668 | 2.8758 | 880 | 0.0767 | 0.9360 | 0.9558 | 0.9458 | 0.9788 |
+| 0.0573 | 2.9412 | 900 | 0.0762 | 0.9283 | 0.9605 | 0.9441 | 0.9793 |
+| 0.0593 | 3.0065 | 920 | 0.0681 | 0.9414 | 0.9595 | 0.9504 | 0.9823 |
+| 0.0463 | 3.0719 | 940 | 0.0751 | 0.9319 | 0.9595 | 0.9455 | 0.9805 |
+| 0.0501 | 3.1373 | 960 | 0.0904 | 0.9169 | 0.9524 | 0.9343 | 0.9758 |
+| 0.0483 | 3.2026 | 980 | 0.0736 | 0.9366 | 0.9526 | 0.9445 | 0.9799 |
+| 0.0535 | 3.2680 | 1000 | 0.0785 | 0.9285 | 0.9542 | 0.9411 | 0.9785 |
+| 0.0526 | 3.3333 | 1020 | 0.0747 | 0.9365 | 0.9581 | 0.9472 | 0.9806 |
+| 0.0534 | 3.3987 | 1040 | 0.0788 | 0.9255 | 0.9631 | 0.9439 | 0.9795 |
+| 0.0615 | 3.4641 | 1060 | 0.0719 | 0.9304 | 0.9589 | 0.9445 | 0.9799 |
+| 0.0485 | 3.5294 | 1080 | 0.0712 | 0.9327 | 0.9525 | 0.9425 | 0.9797 |
+| 0.0484 | 3.5948 | 1100 | 0.0749 | 0.9329 | 0.9625 | 0.9475 | 0.9804 |
+| 0.0452 | 3.6601 | 1120 | 0.0701 | 0.9378 | 0.9580 | 0.9478 | 0.9819 |
+| 0.0622 | 3.7255 | 1140 | 0.0706 | 0.9412 | 0.9580 | 0.9495 | 0.9815 |
+| 0.0491 | 3.7908 | 1160 | 0.0718 | 0.9363 | 0.9588 | 0.9474 | 0.9814 |
+| 0.0601 | 3.8562 | 1180 | 0.0804 | 0.9331 | 0.9617 | 0.9472 | 0.9798 |
+| 0.0592 | 3.9216 | 1200 | 0.0803 | 0.9353 | 0.9569 | 0.9460 | 0.9789 |
+| 0.0596 | 3.9869 | 1220 | 0.0711 | 0.9344 | 0.9600 | 0.9470 | 0.9815 |
+| 0.0416 | 4.0523 | 1240 | 0.0726 | 0.9362 | 0.9594 | 0.9477 | 0.9811 |
+| 0.0357 | 4.1176 | 1260 | 0.0682 | 0.9381 | 0.9621 | 0.9499 | 0.9820 |
+| 0.0416 | 4.1830 | 1280 | 0.0678 | 0.9381 | 0.9611 | 0.9495 | 0.9823 |
+| 0.0444 | 4.2484 | 1300 | 0.0738 | 0.9340 | 0.9554 | 0.9446 | 0.9802 |
+| 0.0414 | 4.3137 | 1320 | 0.0702 | 0.9430 | 0.9520 | 0.9475 | 0.9818 |
+| 0.047 | 4.3791 | 1340 | 0.0715 | 0.9330 | 0.9570 | 0.9449 | 0.9811 |
+| 0.0409 | 4.4444 | 1360 | 0.0723 | 0.9314 | 0.9555 | 0.9433 | 0.9807 |
+| 0.0318 | 4.5098 | 1380 | 0.0736 | 0.9347 | 0.9598 | 0.9471 | 0.9817 |
+| 0.0459 | 4.5752 | 1400 | 0.0723 | 0.9393 | 0.9583 | 0.9488 | 0.9820 |
+| 0.0435 | 4.6405 | 1420 | 0.0729 | 0.9332 | 0.9604 | 0.9466 | 0.9812 |
+| 0.0354 | 4.7059 | 1440 | 0.0745 | 0.9326 | 0.9611 | 0.9467 | 0.9809 |
+| 0.046 | 4.7712 | 1460 | 0.0747 | 0.9345 | 0.9600 | 0.9471 | 0.9812 |
+| 0.0418 | 4.8366 | 1480 | 0.0712 | 0.9421 | 0.9631 | 0.9525 | 0.9827 |
+| 0.0353 | 4.9020 | 1500 | 0.0741 | 0.9337 | 0.9623 | 0.9478 | 0.9814 |
+| 0.0501 | 4.9673 | 1520 | 0.0727 | 0.9348 | 0.9564 | 0.9455 | 0.9813 |
+| 0.0354 | 5.0327 | 1540 | 0.0756 | 0.9314 | 0.9588 | 0.9449 | 0.9806 |
+| 0.0323 | 5.0980 | 1560 | 0.0722 | 0.9382 | 0.9587 | 0.9483 | 0.9820 |
+| 0.0376 | 5.1634 | 1580 | 0.0732 | 0.9354 | 0.9589 | 0.9470 | 0.9813 |
+| 0.0323 | 5.2288 | 1600 | 0.0730 | 0.9336 | 0.9564 | 0.9449 | 0.9809 |
+| 0.0315 | 5.2941 | 1620 | 0.0740 | 0.9342 | 0.9572 | 0.9456 | 0.9808 |
+| 0.0288 | 5.3595 | 1640 | 0.0728 | 0.9376 | 0.9567 | 0.9470 | 0.9815 |
+| 0.0353 | 5.4248 | 1660 | 0.0711 | 0.9369 | 0.9563 | 0.9465 | 0.9815 |
+| 0.0378 | 5.4902 | 1680 | 0.0725 | 0.9379 | 0.9576 | 0.9476 | 0.9820 |
+| 0.0326 | 5.5556 | 1700 | 0.0710 | 0.9411 | 0.9583 | 0.9497 | 0.9824 |
+| 0.0349 | 5.6209 | 1720 | 0.0731 | 0.9346 | 0.9562 | 0.9453 | 0.9812 |
+| 0.0341 | 5.6863 | 1740 | 0.0728 | 0.9361 | 0.9557 | 0.9458 | 0.9813 |
+| 0.0323 | 5.7516 | 1760 | 0.0729 | 0.9367 | 0.9552 | 0.9459 | 0.9815 |
+| 0.0293 | 5.8170 | 1780 | 0.0736 | 0.9340 | 0.9556 | 0.9447 | 0.9809 |
+| 0.0325 | 5.8824 | 1800 | 0.0738 | 0.9348 | 0.9562 | 0.9454 | 0.9812 |
+| 0.0287 | 5.9477 | 1820 | 0.0732 | 0.9365 | 0.9562 | 0.9462 | 0.9815 |
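The precision, recall, and F1 columns are entity-level scores of the kind `generated_from_trainer` NER cards typically compute with seqeval, while accuracy is token-level; a toy sketch of that computation (tag sequences invented for illustration, not drawn from the biobert_json eval split) is:

```python
# Sketch of entity-level NER metrics as seqeval computes them; the tag
# sequences here are toy data, not the card's actual evaluation inputs.
from seqeval.metrics import accuracy_score, f1_score, precision_score, recall_score

y_true = [["B-Drug", "O", "B-Disease", "I-Disease"]]
y_pred = [["B-Drug", "O", "B-Disease", "O"]]

print(precision_score(y_true, y_pred))  # exact-span matches / predicted spans
print(recall_score(y_true, y_pred))     # exact-span matches / gold spans
print(f1_score(y_true, y_pred))         # harmonic mean of the two
print(accuracy_score(y_true, y_pred))   # per-token accuracy
```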
 
 
 ### Framework versions
adapter_model.safetensors
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
+oid sha256:f9469a3117a91e98c9524985d56e60911b25bafafda7906e2945e23d753ec68c
 size 453150800
runs/Dec15_19-37-15_5314af941cb3/events.out.tfevents.1734291437.5314af941cb3.1893.2
CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
-oid sha256:
-size 
+oid sha256:4c27596ebe2e8fdd3e1b5faaf871aa9b422480448f68fb5d6d1fb792423b126d
+size 68998
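Both binary files above are Git LFS pointers: three-line stubs recording the spec version, the sha256 oid of the real payload, and its size in bytes. A local integrity check of the adapter against its pointer, with the download path assumed, could look like:

```python
# Verify a downloaded LFS object against its pointer's oid and size.
# The local path is an assumption for illustration.
import hashlib
import os

path = "adapter_model.safetensors"
expected_oid = "f9469a3117a91e98c9524985d56e60911b25bafafda7906e2945e23d753ec68c"
expected_size = 453150800

digest = hashlib.sha256()
with open(path, "rb") as f:
    for chunk in iter(lambda: f.read(1 << 20), b""):  # stream in 1 MiB chunks
        digest.update(chunk)

assert os.path.getsize(path) == expected_size, "size mismatch"
assert digest.hexdigest() == expected_oid, "sha256 mismatch"
print("pointer matches payload")
```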