---
library_name: transformers
license: apache-2.0
base_model: facebook/detr-resnet-101
tags:
- generated_from_trainer
model-index:
- name: detr_finetuned_cppe5
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# detr_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-101](https://huggingface.co/facebook/detr-resnet-101) on the CPPE-5 dataset.
It achieves the following results on the evaluation set:
- Loss: 1.4034
- Map: 0.2188
- Map 50: 0.4403
- Map 75: 0.1997
- Map Small: 0.08
- Map Medium: 0.1889
- Map Large: 0.3172
- Mar 1: 0.2463
- Mar 10: 0.4367
- Mar 100: 0.4531
- Mar Small: 0.1789
- Mar Medium: 0.4042
- Mar Large: 0.5944
- Map Coverall: 0.4796
- Mar 100 Coverall: 0.6545
- Map Face Shield: 0.1324
- Mar 100 Face Shield: 0.4152
- Map Gloves: 0.1688
- Mar 100 Gloves: 0.4415
- Map Goggles: 0.0598
- Mar 100 Goggles: 0.3662
- Map Mask: 0.2533
- Mar 100 Mask: 0.388

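The snippet below is a minimal inference sketch for a DETR object-detection checkpoint like this one. It assumes the fine-tuned weights are available locally or on the Hub; `detr_finetuned_cppe5` and `example.jpg` are placeholders rather than paths shipped with this card.

```python
# Minimal inference sketch; the checkpoint path and image file are placeholders.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForObjectDetection

checkpoint = "detr_finetuned_cppe5"  # replace with the actual local path or Hub repo id
image_processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForObjectDetection.from_pretrained(checkpoint)

image = Image.open("example.jpg").convert("RGB")
inputs = image_processor(images=image, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Convert raw logits/boxes into (score, label, box) detections in pixel coordinates.
target_sizes = torch.tensor([image.size[::-1]])  # (height, width)
detections = image_processor.post_process_object_detection(
    outputs, threshold=0.5, target_sizes=target_sizes
)[0]

for score, label, box in zip(detections["scores"], detections["labels"], detections["boxes"]):
    print(f"{model.config.id2label[label.item()]}: {score:.2f} at {box.tolist()}")
```
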
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
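
For reference, a hedged sketch of a `TrainingArguments` object matching the list above is shown below; the output directory is a placeholder, and the dataset, image processor, and data collator wiring are omitted.

```python
# Hedged reconstruction of the training configuration from the hyperparameters above;
# "detr_finetuned_cppe5" is a placeholder output directory.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="detr_finetuned_cppe5",
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",          # AdamW; betas=(0.9, 0.999) and eps=1e-8 are its defaults
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```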

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log        | 1.0   | 107  | 2.5957          | 0.0222 | 0.0479 | 0.0199 | 0.0025    | 0.0245     | 0.0253    | 0.0452 | 0.1123 | 0.1379  | 0.0426    | 0.101      | 0.1796    | 0.0923       | 0.3887           | 0.0             | 0.0                 | 0.006      | 0.117          | 0.0         | 0.0             | 0.0127   | 0.1836       |
| No log        | 2.0   | 214  | 2.1891          | 0.0253 | 0.062  | 0.0176 | 0.009     | 0.0244     | 0.0255    | 0.0559 | 0.1512 | 0.1926  | 0.0904    | 0.1489     | 0.2059    | 0.0997       | 0.5279           | 0.0             | 0.0                 | 0.0078     | 0.204          | 0.0         | 0.0             | 0.0188   | 0.2311       |
| No log        | 3.0   | 321  | 2.2235          | 0.0303 | 0.0819 | 0.0181 | 0.0095    | 0.031      | 0.0305    | 0.0484 | 0.1407 | 0.1743  | 0.0702    | 0.1395     | 0.1989    | 0.1186       | 0.4667           | 0.0             | 0.0                 | 0.0055     | 0.1625         | 0.0         | 0.0             | 0.0276   | 0.2422       |
| No log        | 4.0   | 428  | 2.3783          | 0.0372 | 0.0971 | 0.0242 | 0.0055    | 0.0273     | 0.0397    | 0.0551 | 0.1154 | 0.1375  | 0.0482    | 0.1097     | 0.1456    | 0.1679       | 0.3941           | 0.0             | 0.0                 | 0.0052     | 0.1491         | 0.0         | 0.0             | 0.013    | 0.1444       |
| 2.1391        | 5.0   | 535  | 2.1009          | 0.0431 | 0.1221 | 0.0279 | 0.0132    | 0.0452     | 0.0558    | 0.0776 | 0.1594 | 0.1825  | 0.05      | 0.1375     | 0.2131    | 0.1314       | 0.5027           | 0.0035          | 0.0114              | 0.0287     | 0.1929         | 0.0         | 0.0             | 0.0519   | 0.2053       |
| 2.1391        | 6.0   | 642  | 1.9217          | 0.0747 | 0.1698 | 0.0542 | 0.016     | 0.0705     | 0.0871    | 0.0991 | 0.2115 | 0.2292  | 0.0837    | 0.1963     | 0.2613    | 0.2409       | 0.5486           | 0.0157          | 0.0759              | 0.0267     | 0.2272         | 0.0         | 0.0             | 0.0902   | 0.2942       |
| 2.1391        | 7.0   | 749  | 1.8120          | 0.0979 | 0.2292 | 0.0721 | 0.0459    | 0.0843     | 0.1224    | 0.1189 | 0.2312 | 0.2519  | 0.1029    | 0.185      | 0.305     | 0.2764       | 0.6072           | 0.0317          | 0.1329              | 0.0573     | 0.2473         | 0.0006      | 0.0123          | 0.1235   | 0.26         |
| 2.1391        | 8.0   | 856  | 1.7772          | 0.1143 | 0.2514 | 0.0859 | 0.0542    | 0.1032     | 0.1291    | 0.1354 | 0.2623 | 0.2802  | 0.1133    | 0.2414     | 0.3081    | 0.3326       | 0.586            | 0.0228          | 0.1962              | 0.0537     | 0.2621         | 0.0047      | 0.0569          | 0.1579   | 0.3          |
| 2.1391        | 9.0   | 963  | 1.8144          | 0.113  | 0.241  | 0.0953 | 0.033     | 0.1015     | 0.1548    | 0.1359 | 0.2526 | 0.2747  | 0.085     | 0.2129     | 0.3444    | 0.3265       | 0.6144           | 0.0247          | 0.1266              | 0.0667     | 0.2812         | 0.0054      | 0.0523          | 0.1417   | 0.2991       |
| 1.6166        | 10.0  | 1070 | 1.7206          | 0.1323 | 0.3038 | 0.1002 | 0.0364    | 0.1282     | 0.1583    | 0.1603 | 0.3134 | 0.3358  | 0.1358    | 0.2938     | 0.4037    | 0.3642       | 0.5856           | 0.0707          | 0.3253              | 0.0737     | 0.3366         | 0.0116      | 0.1677          | 0.1414   | 0.264        |
| 1.6166        | 11.0  | 1177 | 1.7083          | 0.1276 | 0.2896 | 0.0967 | 0.0388    | 0.0993     | 0.1729    | 0.1609 | 0.2963 | 0.3151  | 0.1067    | 0.2524     | 0.4274    | 0.3697       | 0.595            | 0.0286          | 0.2291              | 0.068      | 0.2879         | 0.0199      | 0.1754          | 0.1518   | 0.288        |
| 1.6166        | 12.0  | 1284 | 1.6834          | 0.1449 | 0.3148 | 0.1108 | 0.0571    | 0.1279     | 0.1974    | 0.1737 | 0.3345 | 0.3563  | 0.1148    | 0.3097     | 0.4659    | 0.3718       | 0.6117           | 0.0711          | 0.3316              | 0.0794     | 0.3254         | 0.0157      | 0.1954          | 0.1867   | 0.3173       |
| 1.6166        | 13.0  | 1391 | 1.6155          | 0.1529 | 0.3458 | 0.1175 | 0.0616    | 0.134      | 0.1933    | 0.1727 | 0.3415 | 0.3575  | 0.1559    | 0.3007     | 0.4476    | 0.3808       | 0.5901           | 0.0938          | 0.319               | 0.0757     | 0.3192         | 0.0223      | 0.24            | 0.1918   | 0.3191       |
| 1.6166        | 14.0  | 1498 | 1.4882          | 0.1804 | 0.4013 | 0.1455 | 0.0763    | 0.1659     | 0.2394    | 0.2082 | 0.3821 | 0.4055  | 0.1954    | 0.376      | 0.4893    | 0.4026       | 0.6351           | 0.1158          | 0.3367              | 0.0936     | 0.3625         | 0.0631      | 0.3062          | 0.227    | 0.3871       |
| 1.3831        | 15.0  | 1605 | 1.5569          | 0.1659 | 0.3549 | 0.1344 | 0.0649    | 0.1433     | 0.2197    | 0.1954 | 0.3651 | 0.3902  | 0.1782    | 0.3434     | 0.4844    | 0.4207       | 0.6329           | 0.0965          | 0.3608              | 0.099      | 0.3808         | 0.0267      | 0.2508          | 0.1865   | 0.3258       |
| 1.3831        | 16.0  | 1712 | 1.4920          | 0.1901 | 0.3959 | 0.1731 | 0.068     | 0.165      | 0.2637    | 0.2107 | 0.3818 | 0.404   | 0.1709    | 0.364      | 0.5193    | 0.4609       | 0.6532           | 0.1177          | 0.3278              | 0.1078     | 0.354          | 0.0381      | 0.3092          | 0.226    | 0.3756       |
| 1.3831        | 17.0  | 1819 | 1.4959          | 0.1778 | 0.3692 | 0.153  | 0.0729    | 0.153      | 0.2576    | 0.2103 | 0.3841 | 0.4024  | 0.166     | 0.344      | 0.5385    | 0.4304       | 0.6545           | 0.1027          | 0.3646              | 0.1167     | 0.3688         | 0.0433      | 0.2923          | 0.1958   | 0.332        |
| 1.3831        | 18.0  | 1926 | 1.4860          | 0.1773 | 0.3851 | 0.143  | 0.0803    | 0.1557     | 0.2503    | 0.2016 | 0.3869 | 0.408   | 0.178     | 0.3553     | 0.5357    | 0.4292       | 0.6541           | 0.1094          | 0.3544              | 0.1216     | 0.3759         | 0.0294      | 0.3323          | 0.1969   | 0.3236       |
| 1.199         | 19.0  | 2033 | 1.4450          | 0.1928 | 0.4098 | 0.1659 | 0.0795    | 0.1614     | 0.2684    | 0.2191 | 0.4086 | 0.4296  | 0.1775    | 0.3737     | 0.5657    | 0.4464       | 0.6568           | 0.1216          | 0.4177              | 0.1339     | 0.3879         | 0.0385      | 0.3338          | 0.2238   | 0.3516       |
| 1.199         | 20.0  | 2140 | 1.4370          | 0.2055 | 0.4167 | 0.1871 | 0.0939    | 0.1832     | 0.2768    | 0.2299 | 0.4268 | 0.4501  | 0.2211    | 0.3888     | 0.584     | 0.4478       | 0.6414           | 0.1407          | 0.4418              | 0.148      | 0.4187         | 0.044       | 0.3692          | 0.247    | 0.3791       |
| 1.199         | 21.0  | 2247 | 1.4372          | 0.2137 | 0.438  | 0.1795 | 0.0881    | 0.1812     | 0.3067    | 0.2359 | 0.4332 | 0.4528  | 0.2222    | 0.3924     | 0.5957    | 0.4607       | 0.6523           | 0.152           | 0.4025              | 0.1584     | 0.4232         | 0.0612      | 0.4031          | 0.2361   | 0.3831       |
| 1.199         | 22.0  | 2354 | 1.4418          | 0.2104 | 0.4147 | 0.2017 | 0.0772    | 0.1752     | 0.3074    | 0.2405 | 0.4256 | 0.4414  | 0.1752    | 0.3699     | 0.6025    | 0.476        | 0.6581           | 0.1244          | 0.3911              | 0.1474     | 0.4201         | 0.0573      | 0.3554          | 0.2467   | 0.3822       |
| 1.199         | 23.0  | 2461 | 1.4337          | 0.2095 | 0.425  | 0.1827 | 0.0662    | 0.1827     | 0.3063    | 0.2347 | 0.4216 | 0.4426  | 0.1679    | 0.381      | 0.5999    | 0.4662       | 0.659            | 0.1335          | 0.4025              | 0.1548     | 0.4156         | 0.0569      | 0.3662          | 0.236    | 0.3698       |
| 1.0916        | 24.0  | 2568 | 1.3970          | 0.2184 | 0.4362 | 0.1992 | 0.0847    | 0.1814     | 0.3202    | 0.2391 | 0.4398 | 0.4564  | 0.1974    | 0.3876     | 0.6128    | 0.4789       | 0.6676           | 0.1366          | 0.4089              | 0.1601     | 0.442          | 0.0626      | 0.3769          | 0.2536   | 0.3867       |
| 1.0916        | 25.0  | 2675 | 1.4135          | 0.2198 | 0.4379 | 0.1938 | 0.0723    | 0.1927     | 0.3118    | 0.2453 | 0.4333 | 0.4531  | 0.1876    | 0.394      | 0.5976    | 0.4783       | 0.6572           | 0.1327          | 0.4038              | 0.1631     | 0.4393         | 0.0628      | 0.3708          | 0.2622   | 0.3947       |
| 1.0916        | 26.0  | 2782 | 1.4002          | 0.2197 | 0.4414 | 0.1969 | 0.0884    | 0.1883     | 0.314     | 0.2477 | 0.4374 | 0.4576  | 0.192     | 0.4035     | 0.6002    | 0.4784       | 0.6595           | 0.1312          | 0.4076              | 0.1609     | 0.4366         | 0.0677      | 0.3908          | 0.2605   | 0.3938       |
| 1.0916        | 27.0  | 2889 | 1.4037          | 0.218  | 0.4408 | 0.1984 | 0.0801    | 0.1856     | 0.3174    | 0.2457 | 0.4373 | 0.4545  | 0.1792    | 0.406      | 0.5993    | 0.4792       | 0.6568           | 0.1307          | 0.4152              | 0.1645     | 0.4366         | 0.0593      | 0.3708          | 0.2564   | 0.3933       |
| 1.0916        | 28.0  | 2996 | 1.4060          | 0.2185 | 0.4402 | 0.2012 | 0.0804    | 0.1864     | 0.3183    | 0.2495 | 0.4357 | 0.4535  | 0.1795    | 0.4038     | 0.5942    | 0.4799       | 0.6568           | 0.1307          | 0.4127              | 0.1662     | 0.4402         | 0.0609      | 0.3708          | 0.2549   | 0.3871       |
| 1.0159        | 29.0  | 3103 | 1.4032          | 0.2185 | 0.44   | 0.1995 | 0.0797    | 0.1867     | 0.3168    | 0.246  | 0.4375 | 0.454   | 0.1787    | 0.4065     | 0.5952    | 0.4795       | 0.6527           | 0.1328          | 0.419               | 0.1682     | 0.4411         | 0.0599      | 0.3708          | 0.2524   | 0.3867       |
| 1.0159        | 30.0  | 3210 | 1.4034          | 0.2188 | 0.4403 | 0.1997 | 0.08      | 0.1889     | 0.3172    | 0.2463 | 0.4367 | 0.4531  | 0.1789    | 0.4042     | 0.5944    | 0.4796       | 0.6545           | 0.1324          | 0.4152              | 0.1688     | 0.4415         | 0.0598      | 0.3662          | 0.2533   | 0.388        |

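The Map/Mar columns above are COCO-style mean average precision and recall values (overall, by IoU threshold, by object size, and per class). The sketch below shows how metrics in this format are commonly computed with `torchmetrics`; torchmetrics is not listed among the framework versions, so this illustrates the metric shape rather than the exact evaluation code used here, and the boxes and labels are toy values.

```python
# Illustration of COCO-style mAP/mAR computation with torchmetrics (assumed tooling,
# not necessarily what produced the table above); boxes and labels are toy values.
import torch
from torchmetrics.detection import MeanAveragePrecision

metric = MeanAveragePrecision(box_format="xyxy", class_metrics=True)

# One dict per image: predictions carry confidence scores, targets are ground truth.
preds = [{
    "boxes": torch.tensor([[10.0, 20.0, 110.0, 220.0]]),
    "scores": torch.tensor([0.87]),
    "labels": torch.tensor([0]),  # e.g. 0 = Coverall after remapping CPPE-5 labels
}]
targets = [{
    "boxes": torch.tensor([[12.0, 25.0, 105.0, 215.0]]),
    "labels": torch.tensor([0]),
}]

metric.update(preds, targets)
results = metric.compute()
print(results["map"], results["map_50"], results["map_75"], results["mar_100"])
print(results["classes"], results["map_per_class"], results["mar_100_per_class"])
```
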

### Framework versions

- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0