# smids_10x_deit_small_rms_001_fold2
This model is a fine-tuned version of facebook/deit-small-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:
- Loss: 0.7042
- Accuracy: 0.8369
## Model description
More information needed
## Intended uses & limitations
More information needed
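No usage details are documented, but since the checkpoint is a standard `AutoModelForImageClassification` fine-tune, a minimal inference sketch along the usual `transformers` lines would look like the following. The repo id is taken from the model-card title; the image path is a placeholder, and the label names depend on the (undocumented) imagefolder dataset.

```python
import torch
from transformers import AutoImageProcessor, AutoModelForImageClassification
from PIL import Image

# Repo id assumed from this card's title; verify it matches the hosted checkpoint.
REPO_ID = "hkivancoral/smids_10x_deit_small_rms_001_fold2"

def classify(image_path: str) -> str:
    """Classify a single image with the fine-tuned DeiT-small checkpoint."""
    processor = AutoImageProcessor.from_pretrained(REPO_ID)
    model = AutoModelForImageClassification.from_pretrained(REPO_ID)
    image = Image.open(image_path).convert("RGB")
    inputs = processor(images=image, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    # id2label comes from the fine-tuning dataset's folder names.
    return model.config.id2label[int(logits.argmax(dim=-1))]
```

This is a sketch, not an officially documented usage snippet; adjust preprocessing if the training pipeline differed from the processor defaults.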
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.001
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 50
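The hyperparameters above imply a concrete learning-rate trajectory: with 750 optimizer steps per epoch (from the results table below) and 50 epochs, training runs 37,500 steps, of which the first 10% (3,750 steps) warm up linearly to 1e-3 before a linear decay to zero. A small sketch of that schedule, mirroring what a linear-with-warmup scheduler computes:

```python
# Schedule implied by the card: linear warmup over warmup_ratio * total
# steps, then linear decay to zero (step counts taken from the results table).
TOTAL_STEPS = 750 * 50                 # 37,500 optimizer steps
WARMUP_STEPS = int(0.1 * TOTAL_STEPS)  # 3,750 warmup steps
PEAK_LR = 1e-3

def lr_at(step: int) -> float:
    """Learning rate at a given optimizer step under the linear schedule."""
    if step < WARMUP_STEPS:
        return PEAK_LR * step / WARMUP_STEPS
    return PEAK_LR * max(0.0, (TOTAL_STEPS - step) / (TOTAL_STEPS - WARMUP_STEPS))

print(lr_at(0))       # 0.0  (start of warmup)
print(lr_at(3750))    # 0.001 (peak, end of epoch 5)
print(lr_at(37500))   # 0.0  (end of training)
```

Note the peak learning rate of 1e-3 is only reached around epoch 5, which may partly explain the noisy validation accuracy in the early epochs of the table below.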
## Training results
Training Loss | Epoch | Step | Validation Loss | Accuracy |
---|---|---|---|---|
0.8864 | 1.0 | 750 | 0.8248 | 0.5474 |
0.7722 | 2.0 | 1500 | 0.8988 | 0.5092 |
0.7721 | 3.0 | 2250 | 0.7604 | 0.6456 |
0.7086 | 4.0 | 3000 | 0.6560 | 0.7371 |
0.6588 | 5.0 | 3750 | 0.6906 | 0.7088 |
0.5657 | 6.0 | 4500 | 0.5964 | 0.7654 |
0.5826 | 7.0 | 5250 | 0.5186 | 0.7854 |
0.5637 | 8.0 | 6000 | 0.5513 | 0.7737 |
0.5395 | 9.0 | 6750 | 0.5704 | 0.7537 |
0.5342 | 10.0 | 7500 | 0.4931 | 0.7987 |
0.5349 | 11.0 | 8250 | 0.5109 | 0.7937 |
0.596 | 12.0 | 9000 | 0.5425 | 0.7804 |
0.5878 | 13.0 | 9750 | 0.4766 | 0.8103 |
0.4609 | 14.0 | 10500 | 0.7520 | 0.7022 |
0.4491 | 15.0 | 11250 | 0.5442 | 0.7504 |
0.4637 | 16.0 | 12000 | 0.5054 | 0.8136 |
0.4699 | 17.0 | 12750 | 0.4927 | 0.8037 |
0.4528 | 18.0 | 13500 | 0.4576 | 0.8120 |
0.4797 | 19.0 | 14250 | 0.4748 | 0.7970 |
0.4704 | 20.0 | 15000 | 0.4438 | 0.8070 |
0.4406 | 21.0 | 15750 | 0.4383 | 0.8153 |
0.4289 | 22.0 | 16500 | 0.4522 | 0.8120 |
0.4219 | 23.0 | 17250 | 0.4457 | 0.8286 |
0.3979 | 24.0 | 18000 | 0.4791 | 0.8203 |
0.476 | 25.0 | 18750 | 0.4867 | 0.8136 |
0.4039 | 26.0 | 19500 | 0.4638 | 0.8319 |
0.4302 | 27.0 | 20250 | 0.4222 | 0.8303 |
0.4091 | 28.0 | 21000 | 0.4516 | 0.8270 |
0.3603 | 29.0 | 21750 | 0.5085 | 0.8170 |
0.4414 | 30.0 | 22500 | 0.4568 | 0.8353 |
0.3768 | 31.0 | 23250 | 0.4984 | 0.8253 |
0.3126 | 32.0 | 24000 | 0.4428 | 0.8436 |
0.3269 | 33.0 | 24750 | 0.4871 | 0.8236 |
0.3283 | 34.0 | 25500 | 0.4708 | 0.8253 |
0.3471 | 35.0 | 26250 | 0.4869 | 0.8353 |
0.3619 | 36.0 | 27000 | 0.5210 | 0.8153 |
0.4176 | 37.0 | 27750 | 0.4744 | 0.8353 |
0.3395 | 38.0 | 28500 | 0.5334 | 0.8386 |
0.2458 | 39.0 | 29250 | 0.5218 | 0.8286 |
0.3331 | 40.0 | 30000 | 0.5874 | 0.8186 |
0.3063 | 41.0 | 30750 | 0.5488 | 0.8236 |
0.2956 | 42.0 | 31500 | 0.5739 | 0.8220 |
0.3105 | 43.0 | 32250 | 0.5441 | 0.8369 |
0.2918 | 44.0 | 33000 | 0.6039 | 0.8303 |
0.2418 | 45.0 | 33750 | 0.6214 | 0.8303 |
0.2859 | 46.0 | 34500 | 0.6601 | 0.8286 |
0.2507 | 47.0 | 35250 | 0.6435 | 0.8369 |
0.2443 | 48.0 | 36000 | 0.6789 | 0.8336 |
0.2825 | 49.0 | 36750 | 0.6931 | 0.8336 |
0.1845 | 50.0 | 37500 | 0.7042 | 0.8369 |
### Framework versions
- Transformers 4.32.1
- Pytorch 2.1.0+cu121
- Datasets 2.12.0
- Tokenizers 0.13.2