# swinv2-tiny-patch4-window8-256-dmae-humeda-DAV6
This model is a fine-tuned version of [microsoft/swinv2-tiny-patch4-window8-256](https://huggingface.co/microsoft/swinv2-tiny-patch4-window8-256) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 1.0899
- Accuracy: 0.5962
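Since the intended uses are not documented, the following is only a minimal inference sketch. It assumes the checkpoint is available on the Hugging Face Hub under this card's repo id; the input image path is a hypothetical placeholder.

```python
# Minimal inference sketch, assuming the checkpoint is hosted on the Hub
# under this card's repo id. "example.jpg" is a hypothetical placeholder;
# the class labels come from the fine-tuned model's config.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV6"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)

image = Image.open("example.jpg").convert("RGB")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")  # resizes to 256x256
with torch.no_grad():
    logits = model(**inputs).logits
pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```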
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (see the `TrainingArguments` sketch after this list):
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
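For reference, these settings map onto the `transformers` `TrainingArguments` roughly as sketched below. This assumes single-device training (so the per-device and total batch sizes coincide); the `output_dir` is illustrative, and dataset loading and `Trainer` wiring are omitted because the training data is not documented.

```python
# Sketch of TrainingArguments matching the hyperparameters above. Assumes
# single-device training; output_dir is illustrative. Dataset and Trainer
# setup are omitted because the training data is not documented.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-dmae-humeda-DAV6",
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
)
```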
### Training results
| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 1.65          | 1.0   | 12   | 1.5637          | 0.3462   |
| 1.5819        | 2.0   | 24   | 1.4962          | 0.3269   |
| 1.4527        | 3.0   | 36   | 1.3413          | 0.4615   |
| 1.2853        | 4.0   | 48   | 1.2034          | 0.4423   |
| 1.0185        | 5.0   | 60   | 1.1204          | 0.4808   |
| 0.8102        | 6.0   | 72   | 1.1004          | 0.4808   |
| 0.7477        | 7.0   | 84   | 1.1705          | 0.4808   |
| 0.6684        | 8.0   | 96   | 1.0245          | 0.5192   |
| 0.6297        | 9.0   | 108  | 1.0010          | 0.5577   |
| 0.5453        | 10.0  | 120  | 1.1905          | 0.4808   |
| 0.5496        | 11.0  | 132  | 1.1227          | 0.4808   |
| 0.4993        | 12.0  | 144  | 0.9619          | 0.5577   |
| 0.4297        | 13.0  | 156  | 1.0743          | 0.5192   |
| 0.4459        | 14.0  | 168  | 1.0195          | 0.5577   |
| 0.4219        | 15.0  | 180  | 1.0888          | 0.5      |
| 0.3742        | 16.0  | 192  | 1.0123          | 0.5769   |
| 0.4603        | 17.0  | 204  | 1.0503          | 0.5192   |
| 0.3607        | 18.0  | 216  | 1.1305          | 0.5577   |
| 0.3399        | 19.0  | 228  | 1.1327          | 0.5385   |
| 0.3422        | 20.0  | 240  | 1.1125          | 0.5192   |
| 0.3254        | 21.0  | 252  | 1.0243          | 0.5769   |
| 0.3363        | 22.0  | 264  | 1.0753          | 0.5577   |
| 0.3203        | 23.0  | 276  | 1.0778          | 0.5577   |
| 0.3248        | 24.0  | 288  | 1.1100          | 0.5385   |
| 0.2446        | 25.0  | 300  | 1.0773          | 0.5577   |
| 0.3058        | 26.0  | 312  | 1.0875          | 0.5769   |
| 0.254         | 27.0  | 324  | 1.0673          | 0.5769   |
| 0.2644        | 28.0  | 336  | 1.1026          | 0.5769   |
| 0.2962        | 29.0  | 348  | 1.0899          | 0.5962   |
| 0.2579        | 30.0  | 360  | 1.0816          | 0.5962   |
### Framework versions
- Transformers 4.47.1
- PyTorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0