---
library_name: transformers
license: apache-2.0
base_model: facebook/detr-resnet-101
tags:
- generated_from_trainer
model-index:
- name: detr_finetuned_cppe5
  results: []
---

# detr_finetuned_cppe5

This model is a fine-tuned version of [facebook/detr-resnet-101](https://huggingface.co/facebook/detr-resnet-101) on the CPPE-5 medical personal protective equipment dataset.
It achieves the following results on the evaluation set:
- Loss: 1.3826
- Map: 0.2211
- Map 50: 0.4433
- Map 75: 0.1865
- Map Small: 0.085
- Map Medium: 0.1852
- Map Large: 0.3102
- Mar 1: 0.2358
- Mar 10: 0.4196
- Mar 100: 0.4395
- Mar Small: 0.1924
- Mar Medium: 0.3868
- Mar Large: 0.5751
- Map Coverall: 0.52
- Mar 100 Coverall: 0.677
- Map Face Shield: 0.1426
- Mar 100 Face Shield: 0.3924
- Map Gloves: 0.1344
- Mar 100 Gloves: 0.4192
- Map Goggles: 0.0746
- Mar 100 Goggles: 0.3277
- Map Mask: 0.2339
- Mar 100 Mask: 0.3813
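
The mAP figures above follow COCO conventions: "Map 50" and "Map 75" are mean average precision at IoU thresholds of 0.5 and 0.75, while "Map" averages over thresholds from 0.5 to 0.95. A minimal sketch of the underlying IoU computation for axis-aligned boxes (the helper name is illustrative):

```python
def box_iou(a, b):
    """Intersection-over-union of two boxes in (xmin, ymin, xmax, ymax) format."""
    ix1, iy1 = max(a[0], b[0]), max(a[1], b[1])
    ix2, iy2 = min(a[2], b[2]), min(a[3], b[3])
    inter = max(0.0, ix2 - ix1) * max(0.0, iy2 - iy1)
    area_a = (a[2] - a[0]) * (a[3] - a[1])
    area_b = (b[2] - b[0]) * (b[3] - b[1])
    union = area_a + area_b - inter
    return inter / union if union > 0 else 0.0
```

A prediction counts as a true positive for the "Map 50" column when its IoU with a ground-truth box of the same class is at least 0.5.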

## Model description

This is DETR (DEtection TRansformer) with a ResNet-101 backbone, fine-tuned for object detection of medical personal protective equipment. It predicts bounding boxes and labels for five classes: coverall, face shield, gloves, goggles, and mask.

## Intended uses & limitations

The model is intended for detecting personal protective equipment in images. Performance is uneven across classes and object sizes: coverall detection is strongest (AP 0.52), while goggles (AP 0.075) and small objects overall (AP 0.085) remain weak, so predictions in those regimes should not be relied on without review.
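
For inference, a checkpoint like this one can be loaded with the transformers object-detection pipeline. A hedged sketch: the model id `detr_finetuned_cppe5`, the image path, and the filtering helper are all placeholders to adapt, not part of this card:

```python
def keep_confident(detections, threshold=0.5):
    """Filter object-detection pipeline outputs, which are dicts with
    'score', 'label', and 'box' keys, by confidence score."""
    return [d for d in detections if d["score"] >= threshold]


if __name__ == "__main__":
    # Import kept inside the guard so the helper above stays usable
    # without transformers installed.
    from transformers import pipeline

    # Assumption: the checkpoint is saved or pushed under this id.
    detector = pipeline("object-detection", model="detr_finetuned_cppe5")
    for det in keep_confident(detector("ppe_example.jpg")):  # hypothetical image
        print(det["label"], round(det["score"], 3), det["box"])
```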

## Training and evaluation data

The model was trained and evaluated on the CPPE-5 dataset, which contains images annotated with bounding boxes for five PPE categories: coverall, face shield, gloves, goggles, and mask.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- num_epochs: 30
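
The values above map directly onto keyword arguments of `transformers.TrainingArguments`. A sketch under the assumption that they were passed through the HF Trainer; `output_dir` is a placeholder:

```python
# Mapping of the listed hyperparameters to transformers.TrainingArguments
# keyword names; build the object with TrainingArguments(**training_kwargs).
training_kwargs = dict(
    output_dir="detr_finetuned_cppe5",  # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                # OptimizerNames.ADAMW_TORCH
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    num_train_epochs=30,
)
```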

### Training results

| Training Loss | Epoch | Step | Validation Loss | Map    | Map 50 | Map 75 | Map Small | Map Medium | Map Large | Mar 1  | Mar 10 | Mar 100 | Mar Small | Mar Medium | Mar Large | Map Coverall | Mar 100 Coverall | Map Face Shield | Mar 100 Face Shield | Map Gloves | Mar 100 Gloves | Map Goggles | Mar 100 Goggles | Map Mask | Mar 100 Mask |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:----------:|:---------:|:------:|:------:|:-------:|:---------:|:----------:|:---------:|:------------:|:----------------:|:---------------:|:-------------------:|:----------:|:--------------:|:-----------:|:---------------:|:--------:|:------------:|
| No log        | 1.0   | 107  | 2.4478          | 0.0132 | 0.0346 | 0.0085 | 0.0053    | 0.0142     | 0.013     | 0.0439 | 0.1138 | 0.1523  | 0.0478    | 0.1121     | 0.1613    | 0.0558       | 0.486            | 0.0             | 0.0                 | 0.0029     | 0.1638         | 0.0         | 0.0             | 0.0073   | 0.1116       |
| No log        | 2.0   | 214  | 2.7727          | 0.009  | 0.028  | 0.0041 | 0.0058    | 0.0185     | 0.0072    | 0.0337 | 0.0875 | 0.1368  | 0.0323    | 0.1066     | 0.1435    | 0.0264       | 0.4171           | 0.0             | 0.0                 | 0.0113     | 0.1304         | 0.0         | 0.0             | 0.0073   | 0.1364       |
| No log        | 3.0   | 321  | 2.6495          | 0.0134 | 0.044  | 0.0056 | 0.0068    | 0.0143     | 0.0141    | 0.0294 | 0.1063 | 0.1269  | 0.0467    | 0.1156     | 0.1424    | 0.0355       | 0.3099           | 0.0             | 0.0                 | 0.0126     | 0.1464         | 0.0         | 0.0             | 0.019    | 0.1782       |
| No log        | 4.0   | 428  | 2.1866          | 0.0354 | 0.0933 | 0.0234 | 0.0103    | 0.0208     | 0.0367    | 0.0535 | 0.1521 | 0.1742  | 0.0574    | 0.1331     | 0.1925    | 0.1323       | 0.4878           | 0.0             | 0.0                 | 0.015      | 0.1871         | 0.0         | 0.0             | 0.0296   | 0.196        |
| 2.1646        | 5.0   | 535  | 1.9950          | 0.0496 | 0.1249 | 0.0335 | 0.0092    | 0.0338     | 0.057     | 0.0665 | 0.1721 | 0.2044  | 0.0729    | 0.1567     | 0.2354    | 0.1684       | 0.5369           | 0.0025          | 0.0114              | 0.0244     | 0.2201         | 0.0         | 0.0             | 0.0527   | 0.2533       |
| 2.1646        | 6.0   | 642  | 2.0231          | 0.0467 | 0.1218 | 0.0301 | 0.0146    | 0.0459     | 0.0611    | 0.0725 | 0.1786 | 0.2064  | 0.078     | 0.1636     | 0.2389    | 0.1419       | 0.5495           | 0.0018          | 0.0342              | 0.0194     | 0.2076         | 0.0         | 0.0             | 0.0705   | 0.2404       |
| 2.1646        | 7.0   | 749  | 1.9400          | 0.0551 | 0.1336 | 0.04   | 0.022     | 0.0467     | 0.0727    | 0.0946 | 0.1933 | 0.2233  | 0.1013    | 0.1747     | 0.2636    | 0.1717       | 0.5207           | 0.0075          | 0.0823              | 0.0229     | 0.2455         | 0.0059      | 0.0046          | 0.0673   | 0.2636       |
| 2.1646        | 8.0   | 856  | 1.9149          | 0.07   | 0.1803 | 0.0523 | 0.037     | 0.0521     | 0.0903    | 0.0986 | 0.1937 | 0.223   | 0.0892    | 0.1663     | 0.2716    | 0.2104       | 0.5694           | 0.0096          | 0.062               | 0.03       | 0.229          | 0.0015      | 0.0077          | 0.0987   | 0.2467       |
| 2.1646        | 9.0   | 963  | 1.8614          | 0.0893 | 0.2058 | 0.0651 | 0.0248    | 0.0795     | 0.1176    | 0.1088 | 0.2341 | 0.2559  | 0.1075    | 0.2323     | 0.2872    | 0.259        | 0.5509           | 0.0201          | 0.1684              | 0.0383     | 0.2545         | 0.0006      | 0.0108          | 0.1285   | 0.2951       |
| 1.7395        | 10.0  | 1070 | 1.7835          | 0.1106 | 0.2638 | 0.0788 | 0.0315    | 0.1011     | 0.1466    | 0.1414 | 0.2696 | 0.297   | 0.1256    | 0.266      | 0.3475    | 0.2673       | 0.5446           | 0.0612          | 0.2494              | 0.0443     | 0.2728         | 0.015       | 0.0969          | 0.1653   | 0.3213       |
| 1.7395        | 11.0  | 1177 | 1.7088          | 0.1197 | 0.2811 | 0.0839 | 0.064     | 0.1103     | 0.1504    | 0.1524 | 0.2979 | 0.316   | 0.1231    | 0.2602     | 0.4093    | 0.314        | 0.605            | 0.0536          | 0.2557              | 0.0512     | 0.2719         | 0.0192      | 0.16            | 0.1608   | 0.2876       |
| 1.7395        | 12.0  | 1284 | 1.6845          | 0.1454 | 0.3145 | 0.1206 | 0.0385    | 0.1227     | 0.195     | 0.178  | 0.3173 | 0.3421  | 0.1181    | 0.294      | 0.443     | 0.3877       | 0.6401           | 0.0626          | 0.2772              | 0.0753     | 0.3272         | 0.0239      | 0.14            | 0.1772   | 0.3258       |
| 1.7395        | 13.0  | 1391 | 1.6600          | 0.1447 | 0.3095 | 0.1262 | 0.0529    | 0.1245     | 0.199     | 0.1701 | 0.3301 | 0.3528  | 0.11      | 0.3093     | 0.4715    | 0.4091       | 0.6437           | 0.0627          | 0.2949              | 0.0596     | 0.3259         | 0.0181      | 0.1815          | 0.1741   | 0.3178       |
| 1.7395        | 14.0  | 1498 | 1.5611          | 0.158  | 0.3529 | 0.1249 | 0.0614    | 0.1434     | 0.2098    | 0.1968 | 0.3545 | 0.3776  | 0.14      | 0.3364     | 0.4679    | 0.3879       | 0.6266           | 0.1051          | 0.3519              | 0.0666     | 0.346          | 0.039       | 0.2231          | 0.1915   | 0.3404       |
| 1.4537        | 15.0  | 1605 | 1.6226          | 0.1779 | 0.3643 | 0.1587 | 0.0609    | 0.1449     | 0.2662    | 0.2166 | 0.3661 | 0.3837  | 0.1291    | 0.3141     | 0.5235    | 0.4187       | 0.6302           | 0.0943          | 0.3519              | 0.0902     | 0.3384         | 0.0598      | 0.2385          | 0.2264   | 0.3596       |
| 1.4537        | 16.0  | 1712 | 1.5840          | 0.1641 | 0.3602 | 0.1287 | 0.0592    | 0.1294     | 0.2399    | 0.2036 | 0.3643 | 0.3828  | 0.1486    | 0.3242     | 0.4985    | 0.3993       | 0.6059           | 0.1291          | 0.3722              | 0.0768     | 0.35           | 0.0347      | 0.2631          | 0.1805   | 0.3227       |
| 1.4537        | 17.0  | 1819 | 1.4955          | 0.1812 | 0.3855 | 0.1458 | 0.0679    | 0.1367     | 0.27      | 0.2093 | 0.3732 | 0.3949  | 0.1526    | 0.3293     | 0.5173    | 0.4528       | 0.6568           | 0.1031          | 0.3468              | 0.0958     | 0.3754         | 0.051       | 0.2385          | 0.2033   | 0.3569       |
| 1.4537        | 18.0  | 1926 | 1.4729          | 0.1899 | 0.4035 | 0.1552 | 0.066     | 0.1613     | 0.2749    | 0.2106 | 0.3969 | 0.4252  | 0.1691    | 0.3685     | 0.557     | 0.4587       | 0.6775           | 0.1148          | 0.3911              | 0.1048     | 0.3982         | 0.0525      | 0.2938          | 0.2187   | 0.3653       |
| 1.2794        | 19.0  | 2033 | 1.4837          | 0.2061 | 0.423  | 0.1807 | 0.0724    | 0.1669     | 0.2828    | 0.226  | 0.4066 | 0.4308  | 0.1742    | 0.3779     | 0.5417    | 0.4828       | 0.6698           | 0.1354          | 0.3886              | 0.1249     | 0.3853         | 0.0581      | 0.3523          | 0.2291   | 0.3582       |
| 1.2794        | 20.0  | 2140 | 1.4320          | 0.2055 | 0.4205 | 0.1675 | 0.0796    | 0.1685     | 0.2868    | 0.225  | 0.4086 | 0.4311  | 0.1831    | 0.3727     | 0.5552    | 0.4733       | 0.6545           | 0.1472          | 0.4051              | 0.1114     | 0.3969         | 0.0687      | 0.3338          | 0.2268   | 0.3653       |
| 1.2794        | 21.0  | 2247 | 1.3978          | 0.2094 | 0.4321 | 0.1701 | 0.0773    | 0.1815     | 0.2968    | 0.2275 | 0.4136 | 0.436   | 0.2046    | 0.3843     | 0.5531    | 0.4764       | 0.6599           | 0.1377          | 0.3759              | 0.1259     | 0.4321         | 0.0765      | 0.3385          | 0.2307   | 0.3733       |
| 1.2794        | 22.0  | 2354 | 1.3970          | 0.2025 | 0.4224 | 0.1652 | 0.0741    | 0.1656     | 0.2877    | 0.2254 | 0.4152 | 0.4381  | 0.2127    | 0.3905     | 0.5433    | 0.4904       | 0.6608           | 0.1122          | 0.4051              | 0.1111     | 0.4107         | 0.06        | 0.3369          | 0.2389   | 0.3769       |
| 1.2794        | 23.0  | 2461 | 1.4207          | 0.2095 | 0.4378 | 0.1769 | 0.0663    | 0.1792     | 0.3028    | 0.233  | 0.4145 | 0.4326  | 0.1553    | 0.3903     | 0.5597    | 0.4918       | 0.6752           | 0.1181          | 0.3911              | 0.1302     | 0.4085         | 0.0771      | 0.3108          | 0.2304   | 0.3773       |
| 1.1354        | 24.0  | 2568 | 1.3942          | 0.214  | 0.4383 | 0.1754 | 0.0763    | 0.1737     | 0.3056    | 0.2244 | 0.4212 | 0.4427  | 0.1957    | 0.3859     | 0.5632    | 0.5148       | 0.6847           | 0.1354          | 0.4025              | 0.129      | 0.4192         | 0.0722      | 0.3323          | 0.2185   | 0.3747       |
| 1.1354        | 25.0  | 2675 | 1.3834          | 0.214  | 0.4377 | 0.1816 | 0.0783    | 0.182      | 0.3042    | 0.2307 | 0.4175 | 0.4352  | 0.1911    | 0.3845     | 0.5586    | 0.5116       | 0.6806           | 0.1269          | 0.3924              | 0.1278     | 0.4049         | 0.0702      | 0.3185          | 0.2336   | 0.3796       |
| 1.1354        | 26.0  | 2782 | 1.3870          | 0.2169 | 0.4399 | 0.1832 | 0.0788    | 0.1837     | 0.3056    | 0.2386 | 0.418  | 0.4372  | 0.1865    | 0.3921     | 0.5633    | 0.5093       | 0.6743           | 0.1476          | 0.3962              | 0.1273     | 0.4054         | 0.0698      | 0.3308          | 0.2303   | 0.3796       |
| 1.1354        | 27.0  | 2889 | 1.3859          | 0.219  | 0.4387 | 0.1816 | 0.0818    | 0.1826     | 0.3069    | 0.2352 | 0.4177 | 0.4376  | 0.1808    | 0.3857     | 0.5744    | 0.5118       | 0.6707           | 0.1416          | 0.3861              | 0.1355     | 0.4237         | 0.0727      | 0.3246          | 0.2336   | 0.3831       |
| 1.1354        | 28.0  | 2996 | 1.3814          | 0.2218 | 0.445  | 0.1878 | 0.0827    | 0.1873     | 0.311     | 0.2385 | 0.4208 | 0.44    | 0.1915    | 0.3871     | 0.5735    | 0.5222       | 0.677            | 0.1423          | 0.3962              | 0.1352     | 0.4174         | 0.0744      | 0.3262          | 0.2349   | 0.3831       |
| 1.0569        | 29.0  | 3103 | 1.3851          | 0.2213 | 0.4429 | 0.1867 | 0.0848    | 0.1858     | 0.3108    | 0.2356 | 0.4204 | 0.4396  | 0.1912    | 0.3876     | 0.5737    | 0.5193       | 0.6761           | 0.1433          | 0.3962              | 0.1343     | 0.4174         | 0.0744      | 0.3262          | 0.235    | 0.3822       |
| 1.0569        | 30.0  | 3210 | 1.3826          | 0.2211 | 0.4433 | 0.1865 | 0.085     | 0.1852     | 0.3102    | 0.2358 | 0.4196 | 0.4395  | 0.1924    | 0.3868     | 0.5751    | 0.52         | 0.677            | 0.1426          | 0.3924              | 0.1344     | 0.4192         | 0.0746      | 0.3277          | 0.2339   | 0.3813       |


### Framework versions

- Transformers 4.48.3
- Pytorch 2.5.1+cu124
- Datasets 3.3.1
- Tokenizers 0.21.0