smids_10x_deit_small_rms_001_fold4

This model is a fine-tuned version of facebook/deit-small-patch16-224 on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 2.4181
  • Accuracy: 0.7883
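
The checkpoint can be loaded with the standard `transformers` image-classification pipeline. The snippet below is a minimal usage sketch, assuming the repository name above is accessible on the Hugging Face Hub; the image path is a placeholder.

```python
from transformers import pipeline

# Load the fine-tuned DeiT classifier from the Hub.
classifier = pipeline(
    "image-classification",
    model="hkivancoral/smids_10x_deit_small_rms_001_fold4",
)

# Replace with a path or URL to an image of your own.
predictions = classifier("example_image.png")
print(predictions)  # list of {"label": ..., "score": ...} dictionaries
```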

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a rough `TrainingArguments` sketch follows the list):

  • learning_rate: 0.001
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 50
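
For reference, these settings map onto `transformers.TrainingArguments` roughly as sketched below. This is a reconstruction rather than the original training script: `output_dir` and `evaluation_strategy` are assumptions, and `Trainer` in this Transformers version uses AdamW by default with the listed betas and epsilon.

```python
from transformers import TrainingArguments

# Sketch of the reported hyperparameters; output_dir and
# evaluation_strategy are assumptions, not taken from the card.
training_args = TrainingArguments(
    output_dir="smids_10x_deit_small_rms_001_fold4",
    learning_rate=1e-3,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=50,
    evaluation_strategy="epoch",
)
```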

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.808         | 1.0   | 750   | 0.7526          | 0.6333   |
| 0.7422        | 2.0   | 1500  | 0.7345          | 0.58     |
| 0.7146        | 3.0   | 2250  | 0.7176          | 0.6433   |
| 0.701         | 4.0   | 3000  | 0.6566          | 0.6933   |
| 0.6799        | 5.0   | 3750  | 0.5734          | 0.745    |
| 0.6007        | 6.0   | 4500  | 0.6368          | 0.72     |
| 0.5519        | 7.0   | 5250  | 0.5547          | 0.7833   |
| 0.6119        | 8.0   | 6000  | 0.5502          | 0.7667   |
| 0.5599        | 9.0   | 6750  | 0.5552          | 0.75     |
| 0.508         | 10.0  | 7500  | 0.5666          | 0.7383   |
| 0.5209        | 11.0  | 8250  | 0.5288          | 0.7683   |
| 0.6053        | 12.0  | 9000  | 0.5408          | 0.765    |
| 0.4938        | 13.0  | 9750  | 0.5449          | 0.755    |
| 0.5012        | 14.0  | 10500 | 0.6211          | 0.7533   |
| 0.4544        | 15.0  | 11250 | 0.6619          | 0.725    |
| 0.4855        | 16.0  | 12000 | 0.5101          | 0.8083   |
| 0.3525        | 17.0  | 12750 | 0.5278          | 0.7833   |
| 0.4312        | 18.0  | 13500 | 0.5051          | 0.7917   |
| 0.4593        | 19.0  | 14250 | 0.5277          | 0.7867   |
| 0.3939        | 20.0  | 15000 | 0.5517          | 0.7833   |
| 0.5185        | 21.0  | 15750 | 0.5418          | 0.7667   |
| 0.424         | 22.0  | 16500 | 0.5465          | 0.7917   |
| 0.3637        | 23.0  | 17250 | 0.5971          | 0.785    |
| 0.4457        | 24.0  | 18000 | 0.5681          | 0.7933   |
| 0.4023        | 25.0  | 18750 | 0.5160          | 0.805    |
| 0.3012        | 26.0  | 19500 | 0.5373          | 0.8283   |
| 0.2933        | 27.0  | 20250 | 0.5885          | 0.8067   |
| 0.3104        | 28.0  | 21000 | 0.6014          | 0.8017   |
| 0.292         | 29.0  | 21750 | 0.6093          | 0.8033   |
| 0.3506        | 30.0  | 22500 | 0.6800          | 0.7633   |
| 0.2777        | 31.0  | 23250 | 0.6858          | 0.795    |
| 0.2396        | 32.0  | 24000 | 0.6442          | 0.805    |
| 0.3316        | 33.0  | 24750 | 0.6440          | 0.81     |
| 0.2958        | 34.0  | 25500 | 0.6532          | 0.8117   |
| 0.2121        | 35.0  | 26250 | 0.7494          | 0.8083   |
| 0.1764        | 36.0  | 27000 | 0.7942          | 0.7933   |
| 0.1963        | 37.0  | 27750 | 0.7817          | 0.7883   |
| 0.1829        | 38.0  | 28500 | 0.8010          | 0.7917   |
| 0.1937        | 39.0  | 29250 | 0.8544          | 0.795    |
| 0.1493        | 40.0  | 30000 | 0.9520          | 0.7967   |
| 0.1419        | 41.0  | 30750 | 0.9695          | 0.81     |
| 0.1784        | 42.0  | 31500 | 1.0763          | 0.8017   |
| 0.1485        | 43.0  | 32250 | 1.1404          | 0.825    |
| 0.0665        | 44.0  | 33000 | 1.3155          | 0.7933   |
| 0.1004        | 45.0  | 33750 | 1.5689          | 0.7933   |
| 0.0917        | 46.0  | 34500 | 1.5920          | 0.7917   |
| 0.0556        | 47.0  | 35250 | 1.8022          | 0.7967   |
| 0.0427        | 48.0  | 36000 | 2.0148          | 0.8067   |
| 0.032         | 49.0  | 36750 | 2.3253          | 0.795    |
| 0.012         | 50.0  | 37500 | 2.4181          | 0.7883   |

Framework versions

  • Transformers 4.32.1
  • Pytorch 2.1.0+cu121
  • Datasets 2.12.0
  • Tokenizers 0.13.2