# swinv2-tiny-patch4-window8-256-dmae-humeda-DAV7
This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window8-256 on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.9241
- Accuracy: 0.6923
## Model description
More information needed
## Intended uses & limitations
More information needed
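A minimal inference sketch, assuming the checkpoint is published under the repo id shown in this card (`RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV7`) and that `transformers` with an image backend (Pillow) is installed:

```python
def classify(image, model_id="RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV7"):
    """Classify one image (a file path, URL, or PIL.Image) with this checkpoint.

    The transformers import is deferred so defining the function is cheap;
    the model weights are downloaded from the Hub on the first call.
    """
    from transformers import pipeline

    clf = pipeline("image-classification", model=model_id)
    return clf(image)  # list of {"label": ..., "score": ...} dicts
```

The `pipeline` helper wraps the image processor and model loading in one call; for finer control you could instead load `AutoImageProcessor` and `AutoModelForImageClassification` separately.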
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 2e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
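The hyperparameters above map onto `transformers.TrainingArguments` roughly as follows (a sketch: the argument names follow the Trainer API, while `output_dir` and the dataset/model wiring are omitted):

```python
# Hedged sketch: the hyperparameters listed above, expressed as keyword
# arguments for transformers.TrainingArguments.
training_kwargs = dict(
    learning_rate=2e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=30,
)
```

These would be passed as `TrainingArguments(output_dir="out", **training_kwargs)` and handed to a `Trainer` together with the model and datasets.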
### Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
1.6286 | 1.0 | 12 | 1.5624 | 0.2692 |
1.5581 | 2.0 | 24 | 1.4706 | 0.5192 |
1.4708 | 3.0 | 36 | 1.2967 | 0.5385 |
1.3022 | 4.0 | 48 | 1.1556 | 0.4615 |
0.9735 | 5.0 | 60 | 1.0354 | 0.5385 |
0.7914 | 6.0 | 72 | 1.1014 | 0.4423 |
0.7376 | 7.0 | 84 | 1.1087 | 0.4808 |
0.692 | 8.0 | 96 | 0.9630 | 0.6346 |
0.6586 | 9.0 | 108 | 0.9429 | 0.6346 |
0.5799 | 10.0 | 120 | 0.9332 | 0.6346 |
0.5557 | 11.0 | 132 | 1.1712 | 0.5192 |
0.5233 | 12.0 | 144 | 1.0447 | 0.5577 |
0.4427 | 13.0 | 156 | 0.8928 | 0.6538 |
0.5043 | 14.0 | 168 | 1.0120 | 0.5962 |
0.4167 | 15.0 | 180 | 0.9241 | 0.6923 |
0.4601 | 16.0 | 192 | 0.8848 | 0.6538 |
0.4619 | 17.0 | 204 | 0.9239 | 0.6923 |
0.3822 | 18.0 | 216 | 0.9208 | 0.6731 |
0.3707 | 19.0 | 228 | 1.0374 | 0.5769 |
0.365 | 20.0 | 240 | 0.9900 | 0.6538 |
0.3412 | 21.0 | 252 | 1.0541 | 0.6538 |
0.3265 | 22.0 | 264 | 0.9913 | 0.6731 |
0.3096 | 23.0 | 276 | 1.0355 | 0.6346 |
0.3603 | 24.0 | 288 | 0.9986 | 0.6538 |
0.2924 | 25.0 | 300 | 1.0046 | 0.6731 |
0.3489 | 26.0 | 312 | 1.0560 | 0.6346 |
0.2974 | 27.0 | 324 | 1.0076 | 0.6538 |
0.2924 | 28.0 | 336 | 1.0164 | 0.6538 |
0.3369 | 29.0 | 348 | 1.0260 | 0.6346 |
0.2884 | 30.0 | 360 | 1.0293 | 0.6346 |
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0