swinv2-tiny-patch4-window8-256-dmae-humeda-DAV63

This model is a fine-tuned version of microsoft/swinv2-tiny-patch4-window8-256 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.2824
  • Accuracy: 0.9029

Model description

More information needed
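
Although no description was provided, the checkpoint can be loaded with the standard transformers image-classification classes. A minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub under this repository id and that `example.jpg` is a placeholder input image:

```python
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV63"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)

image = Image.open("example.jpg")  # placeholder image path (assumption)
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])  # human-readable class label
```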

Intended uses & limitations

More information needed
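
For quick experiments, the same checkpoint can also be queried through the pipeline API. This is a sketch under the same assumptions as the example above (the image path is a placeholder):

```python
from transformers import pipeline

# Image-classification pipeline built from this repository's checkpoint.
classifier = pipeline(
    "image-classification",
    model="RobertoSonic/swinv2-tiny-patch4-window8-256-dmae-humeda-DAV63",
)
print(classifier("example.jpg"))  # placeholder image path (assumption)
```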

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch in code follows this list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 128
  • optimizer: AdamW (torch implementation) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 45
  • mixed_precision_training: Native AMP
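
A minimal sketch of these settings expressed as transformers TrainingArguments, assuming a single device (so a per-device batch size of 32 times 4 accumulation steps gives the total train batch size of 128); the output directory name is a placeholder:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swinv2-tiny-dmae-humeda",  # placeholder name (assumption)
    learning_rate=5e-05,
    per_device_train_batch_size=32,        # single device assumed: 32 x 4 = 128
    per_device_eval_batch_size=32,
    gradient_accumulation_steps=4,
    seed=42,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=45,
    fp16=True,                             # "Native AMP" mixed precision
)
```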

Training results

The evaluation results reported at the top of this card (loss 0.2824, accuracy 0.9029) correspond to epoch 18 in the table below.

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log        | 1.0   | 5    | 1.0434          | 0.4743   |
| 1.0715        | 2.0   | 10   | 1.0274          | 0.4571   |
| 1.0715        | 3.0   | 15   | 0.9232          | 0.5714   |
| 0.9224        | 4.0   | 20   | 0.7760          | 0.7257   |
| 0.9224        | 5.0   | 25   | 0.6111          | 0.7314   |
| 0.659         | 6.0   | 30   | 0.4303          | 0.8514   |
| 0.659         | 7.0   | 35   | 0.3456          | 0.8743   |
| 0.4725        | 8.0   | 40   | 0.4387          | 0.8000   |
| 0.4725        | 9.0   | 45   | 0.3622          | 0.8286   |
| 0.4171        | 10.0  | 50   | 0.3593          | 0.8686   |
| 0.4171        | 11.0  | 55   | 0.3262          | 0.8629   |
| 0.3615        | 12.0  | 60   | 0.3256          | 0.8571   |
| 0.3615        | 13.0  | 65   | 0.3205          | 0.8743   |
| 0.3094        | 14.0  | 70   | 0.3051          | 0.8743   |
| 0.3094        | 15.0  | 75   | 0.2948          | 0.8857   |
| 0.2784        | 16.0  | 80   | 0.3237          | 0.8857   |
| 0.2784        | 17.0  | 85   | 0.2920          | 0.8800   |
| 0.2849        | 18.0  | 90   | 0.2824          | 0.9029   |
| 0.2849        | 19.0  | 95   | 0.3579          | 0.8743   |
| 0.2772        | 20.0  | 100  | 0.2966          | 0.8857   |
| 0.2772        | 21.0  | 105  | 0.3484          | 0.8743   |
| 0.2135        | 22.0  | 110  | 0.3230          | 0.8800   |
| 0.2135        | 23.0  | 115  | 0.3591          | 0.8686   |
| 0.2067        | 24.0  | 120  | 0.3292          | 0.8743   |
| 0.2067        | 25.0  | 125  | 0.3496          | 0.8800   |
| 0.165         | 26.0  | 130  | 0.3628          | 0.8800   |
| 0.165         | 27.0  | 135  | 0.3345          | 0.8914   |
| 0.1888        | 28.0  | 140  | 0.3596          | 0.8800   |
| 0.1888        | 29.0  | 145  | 0.3567          | 0.8914   |
| 0.1473        | 30.0  | 150  | 0.3824          | 0.8743   |
| 0.1473        | 31.0  | 155  | 0.3685          | 0.8800   |
| 0.1472        | 32.0  | 160  | 0.3488          | 0.8971   |
| 0.1472        | 33.0  | 165  | 0.3895          | 0.8857   |
| 0.1568        | 34.0  | 170  | 0.3929          | 0.8914   |
| 0.1568        | 35.0  | 175  | 0.3803          | 0.8857   |
| 0.129         | 36.0  | 180  | 0.3954          | 0.8743   |
| 0.129         | 37.0  | 185  | 0.3920          | 0.8743   |
| 0.1295        | 38.0  | 190  | 0.3945          | 0.8971   |
| 0.1295        | 39.0  | 195  | 0.3836          | 0.8857   |
| 0.1322        | 40.0  | 200  | 0.4102          | 0.8743   |
| 0.1322        | 41.0  | 205  | 0.4226          | 0.8686   |
| 0.1124        | 42.0  | 210  | 0.3939          | 0.8800   |
| 0.1124        | 43.0  | 215  | 0.3911          | 0.8857   |
| 0.1162        | 44.0  | 220  | 0.3986          | 0.8914   |
| 0.1162        | 45.0  | 225  | 0.4015          | 0.8857   |

Framework versions

  • Transformers 4.48.3
  • PyTorch 2.6.0+cu124
  • Datasets 3.4.1
  • Tokenizers 0.21.1