sars973 committed (verified)
Commit 9950aec · Parent(s): 5f9bfd2

Model save
README.md CHANGED
@@ -16,29 +16,29 @@ should probably proofread and complete it, then remove this comment. -->
 
  This model is a fine-tuned version of [facebook/detr-resnet-101](https://huggingface.co/facebook/detr-resnet-101) on an unknown dataset.
  It achieves the following results on the evaluation set:
- - Loss: 1.4203
- - Map: 0.211
- - Map 50: 0.4148
- - Map 75: 0.1838
- - Map Small: 0.0751
- - Map Medium: 0.1771
- - Map Large: 0.3113
- - Mar 1: 0.2413
- - Mar 10: 0.4141
- - Mar 100: 0.4369
- - Mar Small: 0.1463
- - Mar Medium: 0.3822
- - Mar Large: 0.589
- - Map Coverall: 0.4919
- - Mar 100 Coverall: 0.6775
- - Map Face Shield: 0.116
- - Mar 100 Face Shield: 0.4051
- - Map Gloves: 0.126
- - Mar 100 Gloves: 0.4067
- - Map Goggles: 0.0542
- - Mar 100 Goggles: 0.3062
- - Map Mask: 0.2667
- - Mar 100 Mask: 0.3889
+ - Loss: 1.3826
+ - Map: 0.2211
+ - Map 50: 0.4433
+ - Map 75: 0.1865
+ - Map Small: 0.085
+ - Map Medium: 0.1852
+ - Map Large: 0.3102
+ - Mar 1: 0.2358
+ - Mar 10: 0.4196
+ - Mar 100: 0.4395
+ - Mar Small: 0.1924
+ - Mar Medium: 0.3868
+ - Mar Large: 0.5751
+ - Map Coverall: 0.52
+ - Mar 100 Coverall: 0.677
+ - Map Face Shield: 0.1426
+ - Mar 100 Face Shield: 0.3924
+ - Map Gloves: 0.1344
+ - Mar 100 Gloves: 0.4192
+ - Map Goggles: 0.0746
+ - Mar 100 Goggles: 0.3277
+ - Map Mask: 0.2339
+ - Mar 100 Mask: 0.3813
 
  ## Model description
 
@@ -61,7 +61,7 @@ The following hyperparameters were used during training:
  - train_batch_size: 8
  - eval_batch_size: 8
  - seed: 42
- - optimizer: Use adamw_torch with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
+ - optimizer: Use OptimizerNames.ADAMW_TORCH with betas=(0.9,0.999) and epsilon=1e-08 and optimizer_args=No additional optimizer arguments
  - lr_scheduler_type: cosine
  - num_epochs: 30
 
@@ -69,41 +69,41 @@ The following hyperparameters were used during training:
 
  | Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
  |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
- | No log | 1.0 | 107 | 2.4645 | 0.0227 | 0.0572 | 0.016 | 0.004 | 0.0095 | 0.0259 | 0.042 | 0.122 | 0.1707 | 0.0328 | 0.1149 | 0.2212 | 0.1014 | 0.5545 | 0.0 | 0.0 | 0.0031 | 0.1335 | 0.0 | 0.0 | 0.0091 | 0.1653 |
- | No log | 2.0 | 214 | 2.1804 | 0.0448 | 0.1044 | 0.0358 | 0.0148 | 0.0316 | 0.0467 | 0.0628 | 0.1692 | 0.2066 | 0.0811 | 0.1648 | 0.2201 | 0.177 | 0.5851 | 0.0 | 0.0 | 0.0159 | 0.2134 | 0.0 | 0.0 | 0.031 | 0.2347 |
- | No log | 3.0 | 321 | 2.5284 | 0.0155 | 0.0479 | 0.0083 | 0.0035 | 0.0205 | 0.0156 | 0.0308 | 0.1059 | 0.1257 | 0.0295 | 0.0967 | 0.1519 | 0.0652 | 0.3986 | 0.0 | 0.0 | 0.003 | 0.1237 | 0.0 | 0.0 | 0.0093 | 0.1062 |
- | No log | 4.0 | 428 | 2.5653 | 0.0495 | 0.1184 | 0.034 | 0.0134 | 0.0368 | 0.0532 | 0.0689 | 0.1406 | 0.1473 | 0.0536 | 0.1184 | 0.1448 | 0.1843 | 0.4559 | 0.0 | 0.0 | 0.0156 | 0.0982 | 0.0 | 0.0 | 0.0475 | 0.1827 |
- | 2.1715 | 5.0 | 535 | 2.0014 | 0.0562 | 0.1325 | 0.0429 | 0.0107 | 0.0506 | 0.062 | 0.071 | 0.1755 | 0.208 | 0.0996 | 0.1643 | 0.2142 | 0.2171 | 0.5464 | 0.004 | 0.0051 | 0.0111 | 0.2196 | 0.0 | 0.0 | 0.0488 | 0.2689 |
- | 2.1715 | 6.0 | 642 | 2.0494 | 0.0742 | 0.1766 | 0.0547 | 0.0253 | 0.0715 | 0.0841 | 0.0918 | 0.1821 | 0.1947 | 0.0776 | 0.147 | 0.2081 | 0.2231 | 0.5293 | 0.014 | 0.0101 | 0.037 | 0.1933 | 0.0 | 0.0 | 0.0969 | 0.2409 |
- | 2.1715 | 7.0 | 749 | 1.9789 | 0.0762 | 0.1757 | 0.0609 | 0.0175 | 0.0563 | 0.0979 | 0.0947 | 0.1935 | 0.2099 | 0.0754 | 0.14 | 0.2697 | 0.2697 | 0.5482 | 0.0068 | 0.0405 | 0.0326 | 0.2263 | 0.0 | 0.0 | 0.072 | 0.2347 |
- | 2.1715 | 8.0 | 856 | 1.7979 | 0.1115 | 0.2421 | 0.0963 | 0.0385 | 0.0838 | 0.1233 | 0.1205 | 0.2413 | 0.2577 | 0.1064 | 0.2004 | 0.2981 | 0.366 | 0.6 | 0.0318 | 0.1734 | 0.0442 | 0.2375 | 0.0 | 0.0 | 0.1154 | 0.2778 |
- | 2.1715 | 9.0 | 963 | 1.7814 | 0.106 | 0.2485 | 0.0782 | 0.0298 | 0.0914 | 0.1249 | 0.1282 | 0.2654 | 0.291 | 0.1279 | 0.2449 | 0.3136 | 0.3169 | 0.6149 | 0.0396 | 0.2443 | 0.0369 | 0.2661 | 0.0053 | 0.0277 | 0.1316 | 0.3022 |
- | 1.6797 | 10.0 | 1070 | 1.7592 | 0.12 | 0.2686 | 0.09 | 0.0538 | 0.0951 | 0.1547 | 0.1463 | 0.2745 | 0.2955 | 0.1219 | 0.2298 | 0.3636 | 0.348 | 0.5896 | 0.0404 | 0.2418 | 0.0448 | 0.2746 | 0.0066 | 0.0815 | 0.16 | 0.2902 |
- | 1.6797 | 11.0 | 1177 | 1.6620 | 0.1444 | 0.3101 | 0.118 | 0.0517 | 0.107 | 0.1972 | 0.1605 | 0.3127 | 0.3286 | 0.109 | 0.2746 | 0.4274 | 0.4222 | 0.636 | 0.0764 | 0.3304 | 0.0504 | 0.2683 | 0.0125 | 0.0846 | 0.1602 | 0.3236 |
- | 1.6797 | 12.0 | 1284 | 1.6521 | 0.1496 | 0.3058 | 0.1316 | 0.0668 | 0.1276 | 0.195 | 0.179 | 0.3412 | 0.3661 | 0.1314 | 0.3048 | 0.4696 | 0.3977 | 0.659 | 0.0678 | 0.3316 | 0.0651 | 0.3304 | 0.0155 | 0.1677 | 0.2017 | 0.3418 |
- | 1.6797 | 13.0 | 1391 | 1.6103 | 0.1557 | 0.3242 | 0.1339 | 0.0619 | 0.1241 | 0.2161 | 0.1805 | 0.3549 | 0.3805 | 0.1434 | 0.3194 | 0.49 | 0.4414 | 0.6509 | 0.0615 | 0.3734 | 0.0656 | 0.3049 | 0.0273 | 0.2446 | 0.1827 | 0.3284 |
- | 1.6797 | 14.0 | 1498 | 1.5562 | 0.1555 | 0.331 | 0.1264 | 0.0774 | 0.1244 | 0.2154 | 0.1817 | 0.3534 | 0.3801 | 0.1578 | 0.3145 | 0.4972 | 0.4226 | 0.6482 | 0.0506 | 0.3278 | 0.0645 | 0.3357 | 0.025 | 0.2231 | 0.215 | 0.3658 |
- | 1.4442 | 15.0 | 1605 | 1.5950 | 0.1646 | 0.3338 | 0.1427 | 0.0654 | 0.1243 | 0.2369 | 0.2009 | 0.3721 | 0.3948 | 0.1344 | 0.3278 | 0.5394 | 0.4499 | 0.6252 | 0.0607 | 0.4101 | 0.0827 | 0.354 | 0.0168 | 0.2462 | 0.2129 | 0.3387 |
- | 1.4442 | 16.0 | 1712 | 1.5378 | 0.1787 | 0.3643 | 0.1506 | 0.0709 | 0.1506 | 0.2531 | 0.2191 | 0.3931 | 0.4166 | 0.1542 | 0.3543 | 0.5625 | 0.4758 | 0.6631 | 0.0755 | 0.4241 | 0.0807 | 0.3272 | 0.0301 | 0.3092 | 0.2316 | 0.3596 |
- | 1.4442 | 17.0 | 1819 | 1.5125 | 0.1918 | 0.3755 | 0.1708 | 0.0788 | 0.1531 | 0.2661 | 0.223 | 0.3882 | 0.413 | 0.1521 | 0.3452 | 0.5525 | 0.4683 | 0.6541 | 0.1277 | 0.3949 | 0.0788 | 0.3647 | 0.0524 | 0.3031 | 0.2319 | 0.348 |
- | 1.4442 | 18.0 | 1926 | 1.5578 | 0.1828 | 0.3717 | 0.1554 | 0.0728 | 0.1433 | 0.2664 | 0.2173 | 0.3768 | 0.4014 | 0.1351 | 0.3382 | 0.5489 | 0.4573 | 0.645 | 0.1133 | 0.4013 | 0.0831 | 0.3719 | 0.0463 | 0.2662 | 0.214 | 0.3227 |
- | 1.2711 | 19.0 | 2033 | 1.5281 | 0.183 | 0.3667 | 0.1594 | 0.0701 | 0.1424 | 0.269 | 0.2141 | 0.382 | 0.4056 | 0.1545 | 0.3369 | 0.5538 | 0.4556 | 0.6459 | 0.102 | 0.3759 | 0.0856 | 0.3804 | 0.0421 | 0.2723 | 0.2295 | 0.3533 |
- | 1.2711 | 20.0 | 2140 | 1.4865 | 0.1904 | 0.3761 | 0.1706 | 0.0691 | 0.1571 | 0.2782 | 0.2229 | 0.3888 | 0.4176 | 0.147 | 0.3621 | 0.5556 | 0.4628 | 0.6491 | 0.1048 | 0.3962 | 0.1006 | 0.3929 | 0.0512 | 0.2923 | 0.2326 | 0.3578 |
- | 1.2711 | 21.0 | 2247 | 1.4419 | 0.1998 | 0.3915 | 0.1805 | 0.0764 | 0.1666 | 0.2851 | 0.2175 | 0.402 | 0.426 | 0.1665 | 0.3647 | 0.5649 | 0.484 | 0.6622 | 0.097 | 0.3835 | 0.1045 | 0.4129 | 0.053 | 0.2815 | 0.2604 | 0.3898 |
- | 1.2711 | 22.0 | 2354 | 1.4334 | 0.2005 | 0.3988 | 0.1731 | 0.0784 | 0.1593 | 0.2923 | 0.2251 | 0.4072 | 0.4286 | 0.1525 | 0.3701 | 0.5665 | 0.4877 | 0.6662 | 0.1069 | 0.4051 | 0.1085 | 0.392 | 0.0421 | 0.2908 | 0.2574 | 0.3889 |
- | 1.2711 | 23.0 | 2461 | 1.4424 | 0.1944 | 0.3864 | 0.1668 | 0.0766 | 0.1589 | 0.2875 | 0.2263 | 0.3953 | 0.4195 | 0.1527 | 0.3595 | 0.5552 | 0.4836 | 0.6676 | 0.0897 | 0.3924 | 0.1088 | 0.3929 | 0.0426 | 0.2738 | 0.2471 | 0.3707 |
- | 1.1557 | 24.0 | 2568 | 1.4330 | 0.1985 | 0.3946 | 0.1749 | 0.0721 | 0.1629 | 0.2972 | 0.2302 | 0.4083 | 0.4291 | 0.1512 | 0.3689 | 0.5678 | 0.4795 | 0.6653 | 0.0962 | 0.3899 | 0.1172 | 0.4027 | 0.0416 | 0.3015 | 0.2579 | 0.3862 |
- | 1.1557 | 25.0 | 2675 | 1.4414 | 0.2055 | 0.4042 | 0.1823 | 0.0726 | 0.1736 | 0.2998 | 0.2395 | 0.4119 | 0.4336 | 0.1429 | 0.3867 | 0.5743 | 0.4858 | 0.6671 | 0.0996 | 0.4051 | 0.128 | 0.4031 | 0.0504 | 0.3031 | 0.2637 | 0.3898 |
- | 1.1557 | 26.0 | 2782 | 1.4282 | 0.2074 | 0.4059 | 0.1823 | 0.0727 | 0.1728 | 0.3052 | 0.2407 | 0.4168 | 0.4361 | 0.1469 | 0.3789 | 0.5882 | 0.4905 | 0.6766 | 0.1144 | 0.4089 | 0.126 | 0.4071 | 0.0521 | 0.3046 | 0.2542 | 0.3831 |
- | 1.1557 | 27.0 | 2889 | 1.4217 | 0.2071 | 0.404 | 0.1879 | 0.0721 | 0.1745 | 0.3031 | 0.2395 | 0.4148 | 0.4359 | 0.1493 | 0.3809 | 0.5853 | 0.4924 | 0.6806 | 0.115 | 0.4051 | 0.1235 | 0.4134 | 0.0481 | 0.2954 | 0.2564 | 0.3849 |
- | 1.1557 | 28.0 | 2996 | 1.4227 | 0.21 | 0.4135 | 0.1839 | 0.0737 | 0.1748 | 0.309 | 0.2423 | 0.4133 | 0.4385 | 0.1474 | 0.3828 | 0.5933 | 0.4903 | 0.6757 | 0.1144 | 0.4051 | 0.1268 | 0.4089 | 0.0532 | 0.3108 | 0.2652 | 0.392 |
- | 1.0809 | 29.0 | 3103 | 1.4192 | 0.2106 | 0.4139 | 0.1833 | 0.0746 | 0.1768 | 0.3106 | 0.2414 | 0.4134 | 0.4373 | 0.146 | 0.3822 | 0.5903 | 0.4915 | 0.6779 | 0.1161 | 0.4051 | 0.1261 | 0.4076 | 0.0537 | 0.3077 | 0.2654 | 0.388 |
- | 1.0809 | 30.0 | 3210 | 1.4203 | 0.211 | 0.4148 | 0.1838 | 0.0751 | 0.1771 | 0.3113 | 0.2413 | 0.4141 | 0.4369 | 0.1463 | 0.3822 | 0.589 | 0.4919 | 0.6775 | 0.116 | 0.4051 | 0.126 | 0.4067 | 0.0542 | 0.3062 | 0.2667 | 0.3889 |
+ | No log | 1.0 | 107 | 2.4478 | 0.0132 | 0.0346 | 0.0085 | 0.0053 | 0.0142 | 0.013 | 0.0439 | 0.1138 | 0.1523 | 0.0478 | 0.1121 | 0.1613 | 0.0558 | 0.486 | 0.0 | 0.0 | 0.0029 | 0.1638 | 0.0 | 0.0 | 0.0073 | 0.1116 |
+ | No log | 2.0 | 214 | 2.7727 | 0.009 | 0.028 | 0.0041 | 0.0058 | 0.0185 | 0.0072 | 0.0337 | 0.0875 | 0.1368 | 0.0323 | 0.1066 | 0.1435 | 0.0264 | 0.4171 | 0.0 | 0.0 | 0.0113 | 0.1304 | 0.0 | 0.0 | 0.0073 | 0.1364 |
+ | No log | 3.0 | 321 | 2.6495 | 0.0134 | 0.044 | 0.0056 | 0.0068 | 0.0143 | 0.0141 | 0.0294 | 0.1063 | 0.1269 | 0.0467 | 0.1156 | 0.1424 | 0.0355 | 0.3099 | 0.0 | 0.0 | 0.0126 | 0.1464 | 0.0 | 0.0 | 0.019 | 0.1782 |
+ | No log | 4.0 | 428 | 2.1866 | 0.0354 | 0.0933 | 0.0234 | 0.0103 | 0.0208 | 0.0367 | 0.0535 | 0.1521 | 0.1742 | 0.0574 | 0.1331 | 0.1925 | 0.1323 | 0.4878 | 0.0 | 0.0 | 0.015 | 0.1871 | 0.0 | 0.0 | 0.0296 | 0.196 |
+ | 2.1646 | 5.0 | 535 | 1.9950 | 0.0496 | 0.1249 | 0.0335 | 0.0092 | 0.0338 | 0.057 | 0.0665 | 0.1721 | 0.2044 | 0.0729 | 0.1567 | 0.2354 | 0.1684 | 0.5369 | 0.0025 | 0.0114 | 0.0244 | 0.2201 | 0.0 | 0.0 | 0.0527 | 0.2533 |
+ | 2.1646 | 6.0 | 642 | 2.0231 | 0.0467 | 0.1218 | 0.0301 | 0.0146 | 0.0459 | 0.0611 | 0.0725 | 0.1786 | 0.2064 | 0.078 | 0.1636 | 0.2389 | 0.1419 | 0.5495 | 0.0018 | 0.0342 | 0.0194 | 0.2076 | 0.0 | 0.0 | 0.0705 | 0.2404 |
+ | 2.1646 | 7.0 | 749 | 1.9400 | 0.0551 | 0.1336 | 0.04 | 0.022 | 0.0467 | 0.0727 | 0.0946 | 0.1933 | 0.2233 | 0.1013 | 0.1747 | 0.2636 | 0.1717 | 0.5207 | 0.0075 | 0.0823 | 0.0229 | 0.2455 | 0.0059 | 0.0046 | 0.0673 | 0.2636 |
+ | 2.1646 | 8.0 | 856 | 1.9149 | 0.07 | 0.1803 | 0.0523 | 0.037 | 0.0521 | 0.0903 | 0.0986 | 0.1937 | 0.223 | 0.0892 | 0.1663 | 0.2716 | 0.2104 | 0.5694 | 0.0096 | 0.062 | 0.03 | 0.229 | 0.0015 | 0.0077 | 0.0987 | 0.2467 |
+ | 2.1646 | 9.0 | 963 | 1.8614 | 0.0893 | 0.2058 | 0.0651 | 0.0248 | 0.0795 | 0.1176 | 0.1088 | 0.2341 | 0.2559 | 0.1075 | 0.2323 | 0.2872 | 0.259 | 0.5509 | 0.0201 | 0.1684 | 0.0383 | 0.2545 | 0.0006 | 0.0108 | 0.1285 | 0.2951 |
+ | 1.7395 | 10.0 | 1070 | 1.7835 | 0.1106 | 0.2638 | 0.0788 | 0.0315 | 0.1011 | 0.1466 | 0.1414 | 0.2696 | 0.297 | 0.1256 | 0.266 | 0.3475 | 0.2673 | 0.5446 | 0.0612 | 0.2494 | 0.0443 | 0.2728 | 0.015 | 0.0969 | 0.1653 | 0.3213 |
+ | 1.7395 | 11.0 | 1177 | 1.7088 | 0.1197 | 0.2811 | 0.0839 | 0.064 | 0.1103 | 0.1504 | 0.1524 | 0.2979 | 0.316 | 0.1231 | 0.2602 | 0.4093 | 0.314 | 0.605 | 0.0536 | 0.2557 | 0.0512 | 0.2719 | 0.0192 | 0.16 | 0.1608 | 0.2876 |
+ | 1.7395 | 12.0 | 1284 | 1.6845 | 0.1454 | 0.3145 | 0.1206 | 0.0385 | 0.1227 | 0.195 | 0.178 | 0.3173 | 0.3421 | 0.1181 | 0.294 | 0.443 | 0.3877 | 0.6401 | 0.0626 | 0.2772 | 0.0753 | 0.3272 | 0.0239 | 0.14 | 0.1772 | 0.3258 |
+ | 1.7395 | 13.0 | 1391 | 1.6600 | 0.1447 | 0.3095 | 0.1262 | 0.0529 | 0.1245 | 0.199 | 0.1701 | 0.3301 | 0.3528 | 0.11 | 0.3093 | 0.4715 | 0.4091 | 0.6437 | 0.0627 | 0.2949 | 0.0596 | 0.3259 | 0.0181 | 0.1815 | 0.1741 | 0.3178 |
+ | 1.7395 | 14.0 | 1498 | 1.5611 | 0.158 | 0.3529 | 0.1249 | 0.0614 | 0.1434 | 0.2098 | 0.1968 | 0.3545 | 0.3776 | 0.14 | 0.3364 | 0.4679 | 0.3879 | 0.6266 | 0.1051 | 0.3519 | 0.0666 | 0.346 | 0.039 | 0.2231 | 0.1915 | 0.3404 |
+ | 1.4537 | 15.0 | 1605 | 1.6226 | 0.1779 | 0.3643 | 0.1587 | 0.0609 | 0.1449 | 0.2662 | 0.2166 | 0.3661 | 0.3837 | 0.1291 | 0.3141 | 0.5235 | 0.4187 | 0.6302 | 0.0943 | 0.3519 | 0.0902 | 0.3384 | 0.0598 | 0.2385 | 0.2264 | 0.3596 |
+ | 1.4537 | 16.0 | 1712 | 1.5840 | 0.1641 | 0.3602 | 0.1287 | 0.0592 | 0.1294 | 0.2399 | 0.2036 | 0.3643 | 0.3828 | 0.1486 | 0.3242 | 0.4985 | 0.3993 | 0.6059 | 0.1291 | 0.3722 | 0.0768 | 0.35 | 0.0347 | 0.2631 | 0.1805 | 0.3227 |
+ | 1.4537 | 17.0 | 1819 | 1.4955 | 0.1812 | 0.3855 | 0.1458 | 0.0679 | 0.1367 | 0.27 | 0.2093 | 0.3732 | 0.3949 | 0.1526 | 0.3293 | 0.5173 | 0.4528 | 0.6568 | 0.1031 | 0.3468 | 0.0958 | 0.3754 | 0.051 | 0.2385 | 0.2033 | 0.3569 |
+ | 1.4537 | 18.0 | 1926 | 1.4729 | 0.1899 | 0.4035 | 0.1552 | 0.066 | 0.1613 | 0.2749 | 0.2106 | 0.3969 | 0.4252 | 0.1691 | 0.3685 | 0.557 | 0.4587 | 0.6775 | 0.1148 | 0.3911 | 0.1048 | 0.3982 | 0.0525 | 0.2938 | 0.2187 | 0.3653 |
+ | 1.2794 | 19.0 | 2033 | 1.4837 | 0.2061 | 0.423 | 0.1807 | 0.0724 | 0.1669 | 0.2828 | 0.226 | 0.4066 | 0.4308 | 0.1742 | 0.3779 | 0.5417 | 0.4828 | 0.6698 | 0.1354 | 0.3886 | 0.1249 | 0.3853 | 0.0581 | 0.3523 | 0.2291 | 0.3582 |
+ | 1.2794 | 20.0 | 2140 | 1.4320 | 0.2055 | 0.4205 | 0.1675 | 0.0796 | 0.1685 | 0.2868 | 0.225 | 0.4086 | 0.4311 | 0.1831 | 0.3727 | 0.5552 | 0.4733 | 0.6545 | 0.1472 | 0.4051 | 0.1114 | 0.3969 | 0.0687 | 0.3338 | 0.2268 | 0.3653 |
+ | 1.2794 | 21.0 | 2247 | 1.3978 | 0.2094 | 0.4321 | 0.1701 | 0.0773 | 0.1815 | 0.2968 | 0.2275 | 0.4136 | 0.436 | 0.2046 | 0.3843 | 0.5531 | 0.4764 | 0.6599 | 0.1377 | 0.3759 | 0.1259 | 0.4321 | 0.0765 | 0.3385 | 0.2307 | 0.3733 |
+ | 1.2794 | 22.0 | 2354 | 1.3970 | 0.2025 | 0.4224 | 0.1652 | 0.0741 | 0.1656 | 0.2877 | 0.2254 | 0.4152 | 0.4381 | 0.2127 | 0.3905 | 0.5433 | 0.4904 | 0.6608 | 0.1122 | 0.4051 | 0.1111 | 0.4107 | 0.06 | 0.3369 | 0.2389 | 0.3769 |
+ | 1.2794 | 23.0 | 2461 | 1.4207 | 0.2095 | 0.4378 | 0.1769 | 0.0663 | 0.1792 | 0.3028 | 0.233 | 0.4145 | 0.4326 | 0.1553 | 0.3903 | 0.5597 | 0.4918 | 0.6752 | 0.1181 | 0.3911 | 0.1302 | 0.4085 | 0.0771 | 0.3108 | 0.2304 | 0.3773 |
+ | 1.1354 | 24.0 | 2568 | 1.3942 | 0.214 | 0.4383 | 0.1754 | 0.0763 | 0.1737 | 0.3056 | 0.2244 | 0.4212 | 0.4427 | 0.1957 | 0.3859 | 0.5632 | 0.5148 | 0.6847 | 0.1354 | 0.4025 | 0.129 | 0.4192 | 0.0722 | 0.3323 | 0.2185 | 0.3747 |
+ | 1.1354 | 25.0 | 2675 | 1.3834 | 0.214 | 0.4377 | 0.1816 | 0.0783 | 0.182 | 0.3042 | 0.2307 | 0.4175 | 0.4352 | 0.1911 | 0.3845 | 0.5586 | 0.5116 | 0.6806 | 0.1269 | 0.3924 | 0.1278 | 0.4049 | 0.0702 | 0.3185 | 0.2336 | 0.3796 |
+ | 1.1354 | 26.0 | 2782 | 1.3870 | 0.2169 | 0.4399 | 0.1832 | 0.0788 | 0.1837 | 0.3056 | 0.2386 | 0.418 | 0.4372 | 0.1865 | 0.3921 | 0.5633 | 0.5093 | 0.6743 | 0.1476 | 0.3962 | 0.1273 | 0.4054 | 0.0698 | 0.3308 | 0.2303 | 0.3796 |
+ | 1.1354 | 27.0 | 2889 | 1.3859 | 0.219 | 0.4387 | 0.1816 | 0.0818 | 0.1826 | 0.3069 | 0.2352 | 0.4177 | 0.4376 | 0.1808 | 0.3857 | 0.5744 | 0.5118 | 0.6707 | 0.1416 | 0.3861 | 0.1355 | 0.4237 | 0.0727 | 0.3246 | 0.2336 | 0.3831 |
+ | 1.1354 | 28.0 | 2996 | 1.3814 | 0.2218 | 0.445 | 0.1878 | 0.0827 | 0.1873 | 0.311 | 0.2385 | 0.4208 | 0.44 | 0.1915 | 0.3871 | 0.5735 | 0.5222 | 0.677 | 0.1423 | 0.3962 | 0.1352 | 0.4174 | 0.0744 | 0.3262 | 0.2349 | 0.3831 |
+ | 1.0569 | 29.0 | 3103 | 1.3851 | 0.2213 | 0.4429 | 0.1867 | 0.0848 | 0.1858 | 0.3108 | 0.2356 | 0.4204 | 0.4396 | 0.1912 | 0.3876 | 0.5737 | 0.5193 | 0.6761 | 0.1433 | 0.3962 | 0.1343 | 0.4174 | 0.0744 | 0.3262 | 0.235 | 0.3822 |
+ | 1.0569 | 30.0 | 3210 | 1.3826 | 0.2211 | 0.4433 | 0.1865 | 0.085 | 0.1852 | 0.3102 | 0.2358 | 0.4196 | 0.4395 | 0.1924 | 0.3868 | 0.5751 | 0.52 | 0.677 | 0.1426 | 0.3924 | 0.1344 | 0.4192 | 0.0746 | 0.3277 | 0.2339 | 0.3813 |
 
 
  ### Framework versions
 
- - Transformers 4.47.1
- - Pytorch 2.5.1+cu121
- - Datasets 3.2.0
+ - Transformers 4.48.3
+ - Pytorch 2.5.1+cu124
+ - Datasets 3.3.1
  - Tokenizers 0.21.0
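A quick consistency check on the updated evaluation numbers: the headline Map and Mar 100 are COCO-style means over the five per-class values (Coverall, Face Shield, Gloves, Goggles, Mask). A minimal sketch using the final-epoch values from the new README:

```python
# Per-class AP and AR@100 from the final (epoch 30) evaluation in the diff above.
map_per_class = {
    "coverall": 0.52,
    "face_shield": 0.1426,
    "gloves": 0.1344,
    "goggles": 0.0746,
    "mask": 0.2339,
}
mar100_per_class = {
    "coverall": 0.677,
    "face_shield": 0.3924,
    "gloves": 0.4192,
    "goggles": 0.3277,
    "mask": 0.3813,
}

# COCO-style headline metrics: unweighted mean over classes.
overall_map = sum(map_per_class.values()) / len(map_per_class)
overall_mar100 = sum(mar100_per_class.values()) / len(mar100_per_class)

print(round(overall_map, 4), round(overall_mar100, 4))  # 0.2211 0.4395
```

Both round to the reported `Map: 0.2211` and `Mar 100: 0.4395`, confirming the headline numbers are per-class averages.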
model.safetensors CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:27c38d5e0bfa5ab248306db3ad0ea08f72dca18ce63bbd63f89981672d9904f5
+ oid sha256:e210d9e918b6bd37381c8f64688283c4715ed74b17089bd808d179089fbc0b8a
  size 242708664
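The `model.safetensors` entry above changes only a Git LFS pointer, not the weights inline. A pointer of this shape is fully determined by the payload bytes; a minimal sketch of the spec-v1 format (the `lfs_pointer` helper is illustrative, not part of any tool):

```python
import hashlib

def lfs_pointer(data: bytes) -> str:
    """Render a Git LFS pointer (spec v1) for an in-memory payload:
    a version line, the payload's sha256 oid, and its size in bytes."""
    oid = hashlib.sha256(data).hexdigest()
    return (
        "version https://git-lfs.github.com/spec/v1\n"
        f"oid sha256:{oid}\n"
        f"size {len(data)}\n"
    )

# Example payload; a real pointer would be built from the .safetensors bytes.
example = lfs_pointer(b"hello")
```

Comparing a file's recomputed pointer against the committed one is a simple way to verify an LFS download.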
runs/Feb20_08-24-38_9d54af9a211b/events.out.tfevents.1740039918.9d54af9a211b.6619.0 CHANGED
@@ -1,3 +1,3 @@
  version https://git-lfs.github.com/spec/v1
- oid sha256:e473f7574e68d61d78a1a0da99ea704fb4a9fb2a9f4582db672bb2c6f461c615
- size 49500
+ oid sha256:552542d5de2686df39892180b7d0aab5808d4564d491d53989b3b1dfd4e9c041
+ size 51314
runs/Feb20_08-24-38_9d54af9a211b/events.out.tfevents.1740041433.9d54af9a211b.6619.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:de2825db8394efc6782db0130b16610edefa54dd2f0dcfc7683ba5e056f7a29d
+ size 4468
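The README records `lr_scheduler_type: cosine` over 30 epochs at 107 steps/epoch (3210 optimizer steps in the log). Assuming no warmup and a hypothetical base learning rate (the actual value sits outside the visible diff hunk), the schedule decays as lr(t) = 0.5 · lr₀ · (1 + cos(πt/T)):

```python
import math

def cosine_lr(step: int, total_steps: int, base_lr: float) -> float:
    """Cosine decay from base_lr at step 0 down to 0 at total_steps (no warmup)."""
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * step / total_steps))

# 30 epochs x 107 steps/epoch = 3210 optimizer steps, matching the training log.
TOTAL_STEPS = 3210
BASE_LR = 5e-5  # hypothetical base LR for illustration; not stated in the diff
```

At step 0 this returns the base LR, at the midpoint half of it, and essentially 0 at the final step, which is the shape `lr_scheduler_type: cosine` produces.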