---
license: other
tags:
- generated_from_trainer
model-index:
- name: segformer-b0-DeepCrack
  results: []
---

<!-- This model card has been generated automatically according to the information the Trainer had access to. You
should probably proofread and complete it, then remove this comment. -->

# segformer-b0-DeepCrack

This model is a fine-tuned version of [nvidia/mit-b0](https://huggingface.co/nvidia/mit-b0) on the DeepCrack crack-segmentation dataset.
It achieves the following results on the evaluation set:
- Loss: 0.3347
- Mean Iou: 0.6839
- Mean Accuracy: 0.7408
- Overall Accuracy: 0.9681
- Accuracy Background: 0.9897
- Accuracy Crack: 0.4918
- Iou Background: 0.9674
- Iou Crack: 0.4003
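
As a quick sanity check on how these numbers relate, Mean Iou and Mean Accuracy appear to be the unweighted means of the two per-class values, while Overall Accuracy is pixel-weighted and therefore dominated by the background class:

```python
# Per-class values reported above
iou_background, iou_crack = 0.9674, 0.4003
acc_background, acc_crack = 0.9897, 0.4918

# Unweighted two-class means match the reported aggregates
mean_iou = (iou_background + iou_crack) / 2  # ~0.6839
mean_acc = (acc_background + acc_crack) / 2  # ~0.7408

print(round(mean_iou, 3), round(mean_acc, 3))
```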

## Model description

More information needed

## Intended uses & limitations

More information needed
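
No usage snippet was provided; the sketch below shows how a SegFormer checkpoint like this one is typically run for binary segmentation with `transformers`. To keep the sketch self-contained and offline, it instantiates a randomly initialized b0-sized model from its config; in practice you would load the fine-tuned weights with `from_pretrained` (the path shown in the comment is a placeholder).

```python
import torch
from transformers import SegformerConfig, SegformerForSemanticSegmentation

# In practice, load the fine-tuned checkpoint (placeholder path):
# model = SegformerForSemanticSegmentation.from_pretrained("path/to/segformer-b0-DeepCrack")
config = SegformerConfig(num_labels=2)  # assumed labels: 0 = background, 1 = crack
model = SegformerForSemanticSegmentation(config)
model.eval()

pixel_values = torch.randn(1, 3, 512, 512)  # stand-in for a preprocessed image

with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits  # (1, 2, H/4, W/4)

# SegFormer predicts at 1/4 resolution; upsample before the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=pixel_values.shape[-2:], mode="bilinear", align_corners=False
)
mask = upsampled.argmax(dim=1)  # (1, 512, 512), values in {0, 1}
```

In real use, an image would first be resized and normalized (e.g. with `SegformerImageProcessor`) before being passed as `pixel_values`.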

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 6e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 1
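
With `lr_scheduler_type: linear` and the single epoch of 150 optimization steps shown in the table below, the learning rate decays linearly from 6e-05 toward 0. A minimal sketch of that schedule, assuming transformers' default of zero warmup steps:

```python
def linear_lr(step, base_lr=6e-5, total_steps=150, warmup_steps=0):
    """Linear warmup followed by linear decay to 0, as in
    transformers' get_linear_schedule_with_warmup."""
    if step < warmup_steps:
        return base_lr * step / max(1, warmup_steps)
    return base_lr * max(0.0, (total_steps - step) / max(1, total_steps - warmup_steps))

# The LR halves by the midpoint of training and reaches 0 at the final step.
print(linear_lr(0), linear_lr(75), linear_lr(150))
```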

### Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Accuracy Background | Accuracy Crack | Iou Background | Iou Crack |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:-------------:|:----------------:|:-------------------:|:--------------:|:--------------:|:---------:|
| 0.8203        | 0.03  | 5    | 0.6973          | 0.3317   | 0.7410        | 0.5924           | 0.5783              | 0.9037         | 0.5758         | 0.0876    |
| 0.7469        | 0.07  | 10   | 0.6930          | 0.3533   | 0.7185        | 0.6325           | 0.6244              | 0.8125         | 0.6192         | 0.0873    |
| 0.7324        | 0.1   | 15   | 0.6884          | 0.3545   | 0.6605        | 0.6436           | 0.6421              | 0.6788         | 0.6329         | 0.0762    |
| 0.7079        | 0.13  | 20   | 0.6910          | 0.2537   | 0.5518        | 0.4726           | 0.4650              | 0.6386         | 0.4576         | 0.0498    |
| 0.6472        | 0.17  | 25   | 0.6831          | 0.2972   | 0.5734        | 0.5519           | 0.5498              | 0.5969         | 0.5400         | 0.0545    |
| 0.6344        | 0.2   | 30   | 0.6630          | 0.4652   | 0.7477        | 0.8045           | 0.8099              | 0.6854         | 0.7985         | 0.1318    |
| 0.6264        | 0.23  | 35   | 0.6389          | 0.5567   | 0.7850        | 0.8977           | 0.9084              | 0.6617         | 0.8947         | 0.2187    |
| 0.5811        | 0.27  | 40   | 0.6087          | 0.6070   | 0.8069        | 0.9279           | 0.9394              | 0.6745         | 0.9257         | 0.2882    |
| 0.5928        | 0.3   | 45   | 0.5584          | 0.6469   | 0.7851        | 0.9503           | 0.9660              | 0.6042         | 0.9490         | 0.3448    |
| 0.5312        | 0.33  | 50   | 0.5476          | 0.6508   | 0.7789        | 0.9527           | 0.9692              | 0.5886         | 0.9515         | 0.3502    |
| 0.5209        | 0.37  | 55   | 0.5423          | 0.6561   | 0.7665        | 0.9564           | 0.9744              | 0.5586         | 0.9553         | 0.3568    |
| 0.4675        | 0.4   | 60   | 0.5332          | 0.6470   | 0.7529        | 0.9553           | 0.9745              | 0.5313         | 0.9543         | 0.3397    |
| 0.4831        | 0.43  | 65   | 0.4772          | 0.6746   | 0.7502        | 0.9644           | 0.9847              | 0.5157         | 0.9636         | 0.3855    |
| 0.4512        | 0.47  | 70   | 0.4624          | 0.6734   | 0.7830        | 0.9598           | 0.9765              | 0.5895         | 0.9587         | 0.3881    |
| 0.426         | 0.5   | 75   | 0.4589          | 0.6688   | 0.7912        | 0.9572           | 0.9730              | 0.6094         | 0.9561         | 0.3815    |
| 0.4147        | 0.53  | 80   | 0.4529          | 0.6769   | 0.7846        | 0.9606           | 0.9773              | 0.5918         | 0.9596         | 0.3942    |
| 0.4144        | 0.57  | 85   | 0.4160          | 0.6767   | 0.7616        | 0.9635           | 0.9827              | 0.5405         | 0.9627         | 0.3908    |
| 0.4192        | 0.6   | 90   | 0.3747          | 0.6612   | 0.7271        | 0.9639           | 0.9863              | 0.4680         | 0.9631         | 0.3593    |
| 0.4294        | 0.63  | 95   | 0.3649          | 0.6495   | 0.7064        | 0.9637           | 0.9880              | 0.4247         | 0.9630         | 0.3359    |
| 0.3609        | 0.67  | 100  | 0.3730          | 0.6480   | 0.7003        | 0.9642           | 0.9893              | 0.4113         | 0.9636         | 0.3324    |
| 0.3782        | 0.7   | 105  | 0.3699          | 0.6584   | 0.7229        | 0.9637           | 0.9865              | 0.4592         | 0.9630         | 0.3538    |
| 0.3594        | 0.73  | 110  | 0.3505          | 0.6638   | 0.7161        | 0.9662           | 0.9899              | 0.4423         | 0.9656         | 0.3619    |
| 0.3966        | 0.77  | 115  | 0.3474          | 0.6720   | 0.7263        | 0.9670           | 0.9898              | 0.4627         | 0.9663         | 0.3776    |
| 0.3365        | 0.8   | 120  | 0.3598          | 0.6710   | 0.7185        | 0.9678           | 0.9915              | 0.4456         | 0.9672         | 0.3748    |
| 0.3497        | 0.83  | 125  | 0.3530          | 0.6752   | 0.7161        | 0.9692           | 0.9932              | 0.4389         | 0.9686         | 0.3817    |
| 0.3303        | 0.87  | 130  | 0.3424          | 0.6792   | 0.7247        | 0.9690           | 0.9922              | 0.4572         | 0.9684         | 0.3899    |
| 0.3702        | 0.9   | 135  | 0.3379          | 0.6823   | 0.7341        | 0.9686           | 0.9908              | 0.4774         | 0.9679         | 0.3967    |
| 0.3199        | 0.93  | 140  | 0.3317          | 0.6858   | 0.7468        | 0.9678           | 0.9888              | 0.5048         | 0.9671         | 0.4044    |
| 0.304         | 0.97  | 145  | 0.3189          | 0.6854   | 0.7408        | 0.9685           | 0.9900              | 0.4916         | 0.9678         | 0.4030    |
| 0.3392        | 1.0   | 150  | 0.3347          | 0.6839   | 0.7408        | 0.9681           | 0.9897              | 0.4918         | 0.9674         | 0.4003    |


### Framework versions

- Transformers 4.30.2
- Pytorch 2.0.1+cu118
- Datasets 2.13.1
- Tokenizers 0.13.3