swinv2-tiny-patch4-window8-256-dmae-humeda-DAV10

This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window8-256 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 1.0099
  • Accuracy: 0.6731

Model description

More information needed

Intended uses & limitations

More information needed
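
Pending proper documentation, below is a minimal inference sketch using the standard transformers image-classification API. The model id is this repository's; the image path is a placeholder, and what the predicted labels mean depends on the (unspecified) training dataset:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

model_id = "RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV10"
processor = AutoImageProcessor.from_pretrained(model_id)
model = AutoModelForImageClassification.from_pretrained(model_id)
model.eval()

image = Image.open("example.jpg")  # placeholder path
inputs = processor(images=image, return_tensors="pt")  # resizes/normalizes for 256x256 input
with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])  # label semantics depend on the unknown dataset
```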

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a code sketch reconstructing them follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: AdamW (adamw_torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 30
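
A minimal sketch of how these settings map onto transformers.TrainingArguments; the output_dir and anything not listed above are assumptions, not values from the original run:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swinv2-tiny-patch4-window8-256-dmae-humeda-DAV10",  # assumed name
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 32 * 4 = 128
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=30,
    optim="adamw_torch",  # AdamW with default betas=(0.9, 0.999) and eps=1e-08
)
```

The total train batch size of 128 follows from 32 per device × 4 gradient-accumulation steps, which implies training on a single device.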

Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log        | 0.8421  | 4    | 1.5825          | 0.2885   |
| No log        | 1.8421  | 8    | 1.4864          | 0.5385   |
| No log        | 2.8421  | 12   | 1.3187          | 0.5577   |
| No log        | 3.8421  | 16   | 1.0843          | 0.6731   |
| No log        | 4.8421  | 20   | 1.0044          | 0.5962   |
| No log        | 5.8421  | 24   | 0.8914          | 0.6346   |
| No log        | 6.8421  | 28   | 0.9119          | 0.5769   |
| No log        | 7.8421  | 32   | 0.9379          | 0.6538   |
| No log        | 8.8421  | 36   | 0.9471          | 0.6154   |
| No log        | 9.8421  | 40   | 0.9565          | 0.6538   |
| No log        | 10.8421 | 44   | 0.9581          | 0.6923   |
| No log        | 11.8421 | 48   | 0.9655          | 0.6923   |
| 4.2193        | 12.8421 | 52   | 0.9880          | 0.7115   |
| 4.2193        | 13.8421 | 56   | 0.9557          | 0.6923   |
| 4.2193        | 14.8421 | 60   | 0.9275          | 0.6538   |
| 4.2193        | 15.8421 | 64   | 1.0216          | 0.5769   |
| 4.2193        | 16.8421 | 68   | 0.9646          | 0.6346   |
| 4.2193        | 17.8421 | 72   | 0.9957          | 0.6731   |
| 4.2193        | 18.8421 | 76   | 1.0366          | 0.6346   |
| 4.2193        | 19.8421 | 80   | 0.9978          | 0.6538   |
| 4.2193        | 20.8421 | 84   | 0.9941          | 0.6731   |
| 4.2193        | 21.8421 | 88   | 1.0063          | 0.6923   |
| 4.2193        | 22.8421 | 92   | 1.0114          | 0.6731   |
| 4.2193        | 23.8421 | 96   | 1.0134          | 0.6731   |
| 1.6367        | 24.8421 | 100  | 1.0097          | 0.6731   |
| 1.6367        | 25.8421 | 104  | 1.0084          | 0.6731   |
| 1.6367        | 26.8421 | 108  | 1.0097          | 0.6731   |
| 1.6367        | 27.8421 | 112  | 1.0102          | 0.6731   |
| 1.6367        | 28.8421 | 116  | 1.0100          | 0.6731   |
| 1.6367        | 29.8421 | 120  | 1.0099          | 0.6731   |
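
The accuracy column is presumably top-1 accuracy on the evaluation split. The card does not state how it was computed; a minimal sketch of the usual compute_metrics hook for a transformers Trainer, assuming the evaluate library's accuracy metric:

```python
import numpy as np
import evaluate

# Common Trainer pattern: argmax over logits, then top-1 accuracy.
accuracy = evaluate.load("accuracy")

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)  # top-1 prediction per example
    return accuracy.compute(predictions=predictions, references=labels)
```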

Framework versions

  • Transformers 4.47.1
  • Pytorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0