---
library_name: transformers
license: apache-2.0
base_model: facebook/detr-resnet-101
tags:
- generated_from_trainer
model-index:
- name: detr_finetuned_cppe5
  results: []
---

# detr_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-101](https://huggingface.co/facebook/detr-resnet-101) on the CPPE-5 dataset (inferred from the model name and the five evaluated classes; not recorded by the Trainer).
It achieves the following results on the evaluation set:
- Loss: 1.4034
- Map: 0.2188
- Map 50: 0.4403
- Map 75: 0.1997
- Map Small: 0.08
- Map Medium: 0.1889
- Map Large: 0.3172
- Mar 1: 0.2463
- Mar 10: 0.4367
- Mar 100: 0.4531
- Mar Small: 0.1789
- Mar Medium: 0.4042
- Mar Large: 0.5944
- Map Coverall: 0.4796
- Mar 100 Coverall: 0.6545
- Map Face Shield: 0.1324
- Mar 100 Face Shield: 0.4152
- Map Gloves: 0.1688
- Mar 100 Gloves: 0.4415
- Map Goggles: 0.0598
- Mar 100 Goggles: 0.3662
- Map Mask: 0.2533
- Mar 100 Mask: 0.388

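For reference, "Map 50" and "Map 75" are average precision at IoU (intersection-over-union) thresholds of 0.5 and 0.75. A minimal IoU sketch, with boxes as `(xmin, ymin, xmax, ymax)` — a hypothetical helper for illustration, not part of this repository:

```python
def iou(box_a, box_b):
    """Intersection-over-union of two axis-aligned boxes (xmin, ymin, xmax, ymax)."""
    ix1, iy1 = max(box_a[0], box_b[0]), max(box_a[1], box_b[1])
    ix2, iy2 = min(box_a[2], box_b[2]), min(box_a[3], box_b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (box_a[2] - box_a[0]) * (box_a[3] - box_a[1])
    area_b = (box_b[2] - box_b[0]) * (box_b[3] - box_b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0

# A prediction counts as a true positive for "Map 50" when IoU >= 0.5,
# and for the stricter "Map 75" when IoU >= 0.75.
```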
## Model description

DETR (DEtection TRansformer) with a ResNet-101 backbone, fine-tuned for object detection. The checkpoint predicts five personal-protective-equipment classes: Coverall, Face Shield, Gloves, Goggles, and Mask.

## Intended uses & limitations

Intended for detecting medical personal protective equipment in images. Note that per-class performance is uneven (Coverall AP 0.4796 vs. Goggles AP 0.0598) and small-object detection is weak (Map Small 0.08), so predictions for small objects and rare classes should be treated with caution.

## Training and evaluation data

Presumably the CPPE-5 medical personal protective equipment dataset, as suggested by the model name and class set; the dataset split sizes were not recorded by the Trainer.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30

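With `lr_scheduler_type: cosine` and no warmup, the learning rate follows a half-cosine decay from 5e-05 toward 0 over the run (30 epochs × 107 steps/epoch = 3210 optimizer steps, per the training log). A sketch of the schedule — an illustrative helper, not the Trainer's own code, assuming `num_warmup_steps=0`:

```python
import math

def cosine_lr(step: int, total_steps: int, base_lr: float = 5e-5) -> float:
    """Half-cosine decay: base_lr at step 0, ~0 at total_steps (no warmup assumed)."""
    return 0.5 * base_lr * (1.0 + math.cos(math.pi * step / total_steps))

TOTAL_STEPS = 30 * 107  # 30 epochs * 107 optimizer steps per epoch = 3210
```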
### Training results

| Training Loss | Epoch | Step | Validation Loss | Map | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1 | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log | 1.0 | 107 | 2.5957 | 0.0222 | 0.0479 | 0.0199 | 0.0025 | 0.0245 | 0.0253 | 0.0452 | 0.1123 | 0.1379 | 0.0426 | 0.101 | 0.1796 | 0.0923 | 0.3887 | 0.0 | 0.0 | 0.006 | 0.117 | 0.0 | 0.0 | 0.0127 | 0.1836 |
| No log | 2.0 | 214 | 2.1891 | 0.0253 | 0.062 | 0.0176 | 0.009 | 0.0244 | 0.0255 | 0.0559 | 0.1512 | 0.1926 | 0.0904 | 0.1489 | 0.2059 | 0.0997 | 0.5279 | 0.0 | 0.0 | 0.0078 | 0.204 | 0.0 | 0.0 | 0.0188 | 0.2311 |
| No log | 3.0 | 321 | 2.2235 | 0.0303 | 0.0819 | 0.0181 | 0.0095 | 0.031 | 0.0305 | 0.0484 | 0.1407 | 0.1743 | 0.0702 | 0.1395 | 0.1989 | 0.1186 | 0.4667 | 0.0 | 0.0 | 0.0055 | 0.1625 | 0.0 | 0.0 | 0.0276 | 0.2422 |
| No log | 4.0 | 428 | 2.3783 | 0.0372 | 0.0971 | 0.0242 | 0.0055 | 0.0273 | 0.0397 | 0.0551 | 0.1154 | 0.1375 | 0.0482 | 0.1097 | 0.1456 | 0.1679 | 0.3941 | 0.0 | 0.0 | 0.0052 | 0.1491 | 0.0 | 0.0 | 0.013 | 0.1444 |
| 2.1391 | 5.0 | 535 | 2.1009 | 0.0431 | 0.1221 | 0.0279 | 0.0132 | 0.0452 | 0.0558 | 0.0776 | 0.1594 | 0.1825 | 0.05 | 0.1375 | 0.2131 | 0.1314 | 0.5027 | 0.0035 | 0.0114 | 0.0287 | 0.1929 | 0.0 | 0.0 | 0.0519 | 0.2053 |
| 2.1391 | 6.0 | 642 | 1.9217 | 0.0747 | 0.1698 | 0.0542 | 0.016 | 0.0705 | 0.0871 | 0.0991 | 0.2115 | 0.2292 | 0.0837 | 0.1963 | 0.2613 | 0.2409 | 0.5486 | 0.0157 | 0.0759 | 0.0267 | 0.2272 | 0.0 | 0.0 | 0.0902 | 0.2942 |
| 2.1391 | 7.0 | 749 | 1.8120 | 0.0979 | 0.2292 | 0.0721 | 0.0459 | 0.0843 | 0.1224 | 0.1189 | 0.2312 | 0.2519 | 0.1029 | 0.185 | 0.305 | 0.2764 | 0.6072 | 0.0317 | 0.1329 | 0.0573 | 0.2473 | 0.0006 | 0.0123 | 0.1235 | 0.26 |
| 2.1391 | 8.0 | 856 | 1.7772 | 0.1143 | 0.2514 | 0.0859 | 0.0542 | 0.1032 | 0.1291 | 0.1354 | 0.2623 | 0.2802 | 0.1133 | 0.2414 | 0.3081 | 0.3326 | 0.586 | 0.0228 | 0.1962 | 0.0537 | 0.2621 | 0.0047 | 0.0569 | 0.1579 | 0.3 |
| 2.1391 | 9.0 | 963 | 1.8144 | 0.113 | 0.241 | 0.0953 | 0.033 | 0.1015 | 0.1548 | 0.1359 | 0.2526 | 0.2747 | 0.085 | 0.2129 | 0.3444 | 0.3265 | 0.6144 | 0.0247 | 0.1266 | 0.0667 | 0.2812 | 0.0054 | 0.0523 | 0.1417 | 0.2991 |
| 1.6166 | 10.0 | 1070 | 1.7206 | 0.1323 | 0.3038 | 0.1002 | 0.0364 | 0.1282 | 0.1583 | 0.1603 | 0.3134 | 0.3358 | 0.1358 | 0.2938 | 0.4037 | 0.3642 | 0.5856 | 0.0707 | 0.3253 | 0.0737 | 0.3366 | 0.0116 | 0.1677 | 0.1414 | 0.264 |
| 1.6166 | 11.0 | 1177 | 1.7083 | 0.1276 | 0.2896 | 0.0967 | 0.0388 | 0.0993 | 0.1729 | 0.1609 | 0.2963 | 0.3151 | 0.1067 | 0.2524 | 0.4274 | 0.3697 | 0.595 | 0.0286 | 0.2291 | 0.068 | 0.2879 | 0.0199 | 0.1754 | 0.1518 | 0.288 |
| 1.6166 | 12.0 | 1284 | 1.6834 | 0.1449 | 0.3148 | 0.1108 | 0.0571 | 0.1279 | 0.1974 | 0.1737 | 0.3345 | 0.3563 | 0.1148 | 0.3097 | 0.4659 | 0.3718 | 0.6117 | 0.0711 | 0.3316 | 0.0794 | 0.3254 | 0.0157 | 0.1954 | 0.1867 | 0.3173 |
| 1.6166 | 13.0 | 1391 | 1.6155 | 0.1529 | 0.3458 | 0.1175 | 0.0616 | 0.134 | 0.1933 | 0.1727 | 0.3415 | 0.3575 | 0.1559 | 0.3007 | 0.4476 | 0.3808 | 0.5901 | 0.0938 | 0.319 | 0.0757 | 0.3192 | 0.0223 | 0.24 | 0.1918 | 0.3191 |
| 1.6166 | 14.0 | 1498 | 1.4882 | 0.1804 | 0.4013 | 0.1455 | 0.0763 | 0.1659 | 0.2394 | 0.2082 | 0.3821 | 0.4055 | 0.1954 | 0.376 | 0.4893 | 0.4026 | 0.6351 | 0.1158 | 0.3367 | 0.0936 | 0.3625 | 0.0631 | 0.3062 | 0.227 | 0.3871 |
| 1.3831 | 15.0 | 1605 | 1.5569 | 0.1659 | 0.3549 | 0.1344 | 0.0649 | 0.1433 | 0.2197 | 0.1954 | 0.3651 | 0.3902 | 0.1782 | 0.3434 | 0.4844 | 0.4207 | 0.6329 | 0.0965 | 0.3608 | 0.099 | 0.3808 | 0.0267 | 0.2508 | 0.1865 | 0.3258 |
| 1.3831 | 16.0 | 1712 | 1.4920 | 0.1901 | 0.3959 | 0.1731 | 0.068 | 0.165 | 0.2637 | 0.2107 | 0.3818 | 0.404 | 0.1709 | 0.364 | 0.5193 | 0.4609 | 0.6532 | 0.1177 | 0.3278 | 0.1078 | 0.354 | 0.0381 | 0.3092 | 0.226 | 0.3756 |
| 1.3831 | 17.0 | 1819 | 1.4959 | 0.1778 | 0.3692 | 0.153 | 0.0729 | 0.153 | 0.2576 | 0.2103 | 0.3841 | 0.4024 | 0.166 | 0.344 | 0.5385 | 0.4304 | 0.6545 | 0.1027 | 0.3646 | 0.1167 | 0.3688 | 0.0433 | 0.2923 | 0.1958 | 0.332 |
| 1.3831 | 18.0 | 1926 | 1.4860 | 0.1773 | 0.3851 | 0.143 | 0.0803 | 0.1557 | 0.2503 | 0.2016 | 0.3869 | 0.408 | 0.178 | 0.3553 | 0.5357 | 0.4292 | 0.6541 | 0.1094 | 0.3544 | 0.1216 | 0.3759 | 0.0294 | 0.3323 | 0.1969 | 0.3236 |
| 1.199 | 19.0 | 2033 | 1.4450 | 0.1928 | 0.4098 | 0.1659 | 0.0795 | 0.1614 | 0.2684 | 0.2191 | 0.4086 | 0.4296 | 0.1775 | 0.3737 | 0.5657 | 0.4464 | 0.6568 | 0.1216 | 0.4177 | 0.1339 | 0.3879 | 0.0385 | 0.3338 | 0.2238 | 0.3516 |
| 1.199 | 20.0 | 2140 | 1.4370 | 0.2055 | 0.4167 | 0.1871 | 0.0939 | 0.1832 | 0.2768 | 0.2299 | 0.4268 | 0.4501 | 0.2211 | 0.3888 | 0.584 | 0.4478 | 0.6414 | 0.1407 | 0.4418 | 0.148 | 0.4187 | 0.044 | 0.3692 | 0.247 | 0.3791 |
| 1.199 | 21.0 | 2247 | 1.4372 | 0.2137 | 0.438 | 0.1795 | 0.0881 | 0.1812 | 0.3067 | 0.2359 | 0.4332 | 0.4528 | 0.2222 | 0.3924 | 0.5957 | 0.4607 | 0.6523 | 0.152 | 0.4025 | 0.1584 | 0.4232 | 0.0612 | 0.4031 | 0.2361 | 0.3831 |
| 1.199 | 22.0 | 2354 | 1.4418 | 0.2104 | 0.4147 | 0.2017 | 0.0772 | 0.1752 | 0.3074 | 0.2405 | 0.4256 | 0.4414 | 0.1752 | 0.3699 | 0.6025 | 0.476 | 0.6581 | 0.1244 | 0.3911 | 0.1474 | 0.4201 | 0.0573 | 0.3554 | 0.2467 | 0.3822 |
| 1.199 | 23.0 | 2461 | 1.4337 | 0.2095 | 0.425 | 0.1827 | 0.0662 | 0.1827 | 0.3063 | 0.2347 | 0.4216 | 0.4426 | 0.1679 | 0.381 | 0.5999 | 0.4662 | 0.659 | 0.1335 | 0.4025 | 0.1548 | 0.4156 | 0.0569 | 0.3662 | 0.236 | 0.3698 |
| 1.0916 | 24.0 | 2568 | 1.3970 | 0.2184 | 0.4362 | 0.1992 | 0.0847 | 0.1814 | 0.3202 | 0.2391 | 0.4398 | 0.4564 | 0.1974 | 0.3876 | 0.6128 | 0.4789 | 0.6676 | 0.1366 | 0.4089 | 0.1601 | 0.442 | 0.0626 | 0.3769 | 0.2536 | 0.3867 |
| 1.0916 | 25.0 | 2675 | 1.4135 | 0.2198 | 0.4379 | 0.1938 | 0.0723 | 0.1927 | 0.3118 | 0.2453 | 0.4333 | 0.4531 | 0.1876 | 0.394 | 0.5976 | 0.4783 | 0.6572 | 0.1327 | 0.4038 | 0.1631 | 0.4393 | 0.0628 | 0.3708 | 0.2622 | 0.3947 |
| 1.0916 | 26.0 | 2782 | 1.4002 | 0.2197 | 0.4414 | 0.1969 | 0.0884 | 0.1883 | 0.314 | 0.2477 | 0.4374 | 0.4576 | 0.192 | 0.4035 | 0.6002 | 0.4784 | 0.6595 | 0.1312 | 0.4076 | 0.1609 | 0.4366 | 0.0677 | 0.3908 | 0.2605 | 0.3938 |
| 1.0916 | 27.0 | 2889 | 1.4037 | 0.218 | 0.4408 | 0.1984 | 0.0801 | 0.1856 | 0.3174 | 0.2457 | 0.4373 | 0.4545 | 0.1792 | 0.406 | 0.5993 | 0.4792 | 0.6568 | 0.1307 | 0.4152 | 0.1645 | 0.4366 | 0.0593 | 0.3708 | 0.2564 | 0.3933 |
| 1.0916 | 28.0 | 2996 | 1.4060 | 0.2185 | 0.4402 | 0.2012 | 0.0804 | 0.1864 | 0.3183 | 0.2495 | 0.4357 | 0.4535 | 0.1795 | 0.4038 | 0.5942 | 0.4799 | 0.6568 | 0.1307 | 0.4127 | 0.1662 | 0.4402 | 0.0609 | 0.3708 | 0.2549 | 0.3871 |
| 1.0159 | 29.0 | 3103 | 1.4032 | 0.2185 | 0.44 | 0.1995 | 0.0797 | 0.1867 | 0.3168 | 0.246 | 0.4375 | 0.454 | 0.1787 | 0.4065 | 0.5952 | 0.4795 | 0.6527 | 0.1328 | 0.419 | 0.1682 | 0.4411 | 0.0599 | 0.3708 | 0.2524 | 0.3867 |
| 1.0159 | 30.0 | 3210 | 1.4034 | 0.2188 | 0.4403 | 0.1997 | 0.08 | 0.1889 | 0.3172 | 0.2463 | 0.4367 | 0.4531 | 0.1789 | 0.4042 | 0.5944 | 0.4796 | 0.6545 | 0.1324 | 0.4152 | 0.1688 | 0.4415 | 0.0598 | 0.3662 | 0.2533 | 0.388 |

### Framework versions

- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0
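### How to use

The checkpoint can be loaded through the `transformers` object-detection pipeline. A minimal sketch — the model id and image path are placeholders for your local path or Hub repo id, and the label list is taken from the per-class metrics above:

```python
from typing import List

# The five classes this checkpoint was evaluated on (see the metrics above).
CPPE5_LABELS: List[str] = ["Coverall", "Face Shield", "Gloves", "Goggles", "Mask"]

def detect_ppe(image_path: str, model_id: str = "detr_finetuned_cppe5", threshold: float = 0.5):
    """Run PPE detection; model_id is a placeholder for a local path or Hub repo id.

    transformers/torch are imported lazily so the sketch itself stays lightweight.
    Each result is a dict: {"score": float, "label": str, "box": {xmin, ymin, xmax, ymax}}.
    """
    from transformers import pipeline

    detector = pipeline("object-detection", model=model_id)
    return detector(image_path, threshold=threshold)
```

Given the uneven per-class AP, a higher `threshold` may be appropriate for the weaker classes (Goggles, Face Shield).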
model.safetensors CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:a48acb497b7073cd9adc8b2dd131e86b40e64517936befcff5229f942010b38a
3
  size 242708664
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:868d98b911484a20f7a6d8420d3fcbc2dc86e7f245422dcbce29ab6c0fc11fd7
3
  size 242708664
runs/Jan08_13-57-42_837f86c4c593/events.out.tfevents.1736345031.837f86c4c593.363.0 CHANGED
@@ -1,3 +1,3 @@
1
  version https://git-lfs.github.com/spec/v1
2
- oid sha256:42aa54f39b92874290171ba19b02bdb150c5944956f65b9b0877e6747ac47b35
3
- size 49500
 
1
  version https://git-lfs.github.com/spec/v1
2
+ oid sha256:6cf96e294bbaa19fcf8c8383e5184b629802ec95b07d58d6d8e5a3cd87944be3
3
+ size 51314
runs/Jan08_13-57-42_837f86c4c593/events.out.tfevents.1736354281.837f86c4c593.363.1 ADDED
@@ -0,0 +1,3 @@
 
 
 
 
1
+ version https://git-lfs.github.com/spec/v1
2
+ oid sha256:18b2d5918c3ad0ccbaefcd2cfcb51d73627e163e2cb22b08fd09159af38c03da
3
+ size 1548