smids_5x_deit_tiny_sgd_001_fold5

This model is a fine-tuned version of facebook/deit-small-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2848
  • Accuracy: 0.8717
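
The card does not include usage notes, but assuming the checkpoint is published on the Hugging Face Hub as hkivancoral/smids_5x_deit_tiny_sgd_001_fold5 and follows the standard image-classification setup, a minimal inference sketch could look like the following (the image path is a placeholder and the label names come from the checkpoint's own config):

```python
# Minimal inference sketch; repository id taken from this card,
# "example.jpg" is a placeholder for an image from the target domain.
from transformers import pipeline
from PIL import Image

classifier = pipeline(
    "image-classification",
    model="hkivancoral/smids_5x_deit_tiny_sgd_001_fold5",
)

image = Image.open("example.jpg").convert("RGB")
for prediction in classifier(image):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```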

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative mapping to a Trainer configuration is sketched after the list):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
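
For readers who want to set up a comparable run, the values above map directly onto transformers TrainingArguments. The sketch below is an illustration under that assumption; the output directory and the evaluation/save strategies are guesses, not taken from the original training script:

```python
# Sketch of how the listed hyperparameters could be expressed as TrainingArguments.
# The output_dir and the per-epoch evaluation/save strategies are assumptions
# (the card reports one validation row per epoch, so "epoch" strategies fit).
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="smids_5x_deit_tiny_sgd_001_fold5",  # assumed output path
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",
    save_strategy="epoch",
    # optimizer: Trainer's default AdamW already uses betas=(0.9, 0.999) and
    # eps=1e-8, matching the values listed above, so no override is needed.
)
```

The dataset splits, image preprocessing, and any augmentations are not documented in this card, so they are omitted from the sketch.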

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.7235        | 1.0   | 375   | 0.7207          | 0.7267   |
| 0.5565        | 2.0   | 750   | 0.5409          | 0.7917   |
| 0.4119        | 3.0   | 1125  | 0.4652          | 0.825    |
| 0.4334        | 4.0   | 1500  | 0.4261          | 0.825    |
| 0.3887        | 5.0   | 1875  | 0.3995          | 0.8317   |
| 0.3453        | 6.0   | 2250  | 0.3796          | 0.8483   |
| 0.3126        | 7.0   | 2625  | 0.3666          | 0.8533   |
| 0.2904        | 8.0   | 3000  | 0.3555          | 0.8533   |
| 0.2918        | 9.0   | 3375  | 0.3426          | 0.8617   |
| 0.2742        | 10.0  | 3750  | 0.3359          | 0.8583   |
| 0.2643        | 11.0  | 4125  | 0.3302          | 0.865    |
| 0.2302        | 12.0  | 4500  | 0.3258          | 0.8617   |
| 0.2469        | 13.0  | 4875  | 0.3202          | 0.8633   |
| 0.2562        | 14.0  | 5250  | 0.3189          | 0.8683   |
| 0.29          | 15.0  | 5625  | 0.3116          | 0.87     |
| 0.1812        | 16.0  | 6000  | 0.3053          | 0.87     |
| 0.297         | 17.0  | 6375  | 0.3040          | 0.8733   |
| 0.2383        | 18.0  | 6750  | 0.3060          | 0.8733   |
| 0.2162        | 19.0  | 7125  | 0.3016          | 0.8683   |
| 0.1713        | 20.0  | 7500  | 0.3023          | 0.8717   |
| 0.1959        | 21.0  | 7875  | 0.2972          | 0.8767   |
| 0.1937        | 22.0  | 8250  | 0.2967          | 0.87     |
| 0.2843        | 23.0  | 8625  | 0.2932          | 0.8683   |
| 0.199         | 24.0  | 9000  | 0.2945          | 0.8717   |
| 0.2226        | 25.0  | 9375  | 0.2906          | 0.875    |
| 0.2091        | 26.0  | 9750  | 0.2867          | 0.8683   |
| 0.2068        | 27.0  | 10125 | 0.2936          | 0.8717   |
| 0.1892        | 28.0  | 10500 | 0.2828          | 0.8767   |
| 0.2214        | 29.0  | 10875 | 0.2884          | 0.8683   |
| 0.1748        | 30.0  | 11250 | 0.2926          | 0.8683   |
| 0.2363        | 31.0  | 11625 | 0.2864          | 0.8733   |
| 0.209         | 32.0  | 12000 | 0.2852          | 0.8733   |
| 0.2053        | 33.0  | 12375 | 0.2863          | 0.8683   |
| 0.2387        | 34.0  | 12750 | 0.2806          | 0.8767   |
| 0.1656        | 35.0  | 13125 | 0.2863          | 0.87     |
| 0.1491        | 36.0  | 13500 | 0.2870          | 0.8667   |
| 0.1628        | 37.0  | 13875 | 0.2841          | 0.87     |
| 0.1814        | 38.0  | 14250 | 0.2851          | 0.8683   |
| 0.1769        | 39.0  | 14625 | 0.2852          | 0.8667   |
| 0.1855        | 40.0  | 15000 | 0.2843          | 0.8717   |
| 0.185         | 41.0  | 15375 | 0.2824          | 0.8717   |
| 0.1482        | 42.0  | 15750 | 0.2829          | 0.8733   |
| 0.2112        | 43.0  | 16125 | 0.2835          | 0.875    |
| 0.2011        | 44.0  | 16500 | 0.2835          | 0.8733   |
| 0.1775        | 45.0  | 16875 | 0.2835          | 0.8717   |
| 0.1816        | 46.0  | 17250 | 0.2845          | 0.8717   |
| 0.1826        | 47.0  | 17625 | 0.2843          | 0.8717   |
| 0.1411        | 48.0  | 18000 | 0.2851          | 0.8683   |
| 0.1734        | 49.0  | 18375 | 0.2850          | 0.8717   |
| 0.1888        | 50.0  | 18750 | 0.2848          | 0.8717   |

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.1+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2