---
library_name: transformers
license: apache-2.0
base_model: Melo1512/vit-msn-small-lateral_flow_ivalidation_train_test_7
tags:
- generated_from_trainer
datasets:
- imagefolder
metrics:
- accuracy
model-index:
- name: vit-msn-small-lateral_flow_ivalidation_train_test_7
  results:
  - task:
      name: Image Classification
      type: image-classification
    dataset:
      name: imagefolder
      type: imagefolder
      config: default
      split: test
      args: default
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.8791208791208791
---

# vit-msn-small-lateral_flow_ivalidation_train_test_7

This model is a fine-tuned version of [Melo1512/vit-msn-small-lateral_flow_ivalidation_train_test_7](https://huggingface.co/Melo1512/vit-msn-small-lateral_flow_ivalidation_train_test_7) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.4160
- Accuracy: 0.8791
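
The card ships no usage code; below is a minimal inference sketch using the `transformers` pipeline API. The image path `strip.jpg` is a placeholder, not a file from the original repository.

```python
# Minimal inference sketch for this checkpoint; "strip.jpg" is a
# placeholder path standing in for a lateral-flow test image.
from transformers import pipeline

classifier = pipeline(
    "image-classification",
    model="Melo1512/vit-msn-small-lateral_flow_ivalidation_train_test_7",
)

# Returns a list of {"label": ..., "score": ...} dicts, best score first.
print(classifier("strip.jpg"))
```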

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.2
- num_epochs: 100
- label_smoothing_factor: 0.1
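
As a rough sketch, these settings map onto `transformers.TrainingArguments` as follows. The `output_dir` is a placeholder, and the Adam betas and epsilon listed above are the library defaults rather than custom choices.

```python
from transformers import TrainingArguments

# Sketch of the hyperparameters above as TrainingArguments; output_dir
# is a placeholder, not the original training script's value.
training_args = TrainingArguments(
    output_dir="vit-msn-small-lateral_flow_ivalidation_train_test_7",
    learning_rate=1e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    gradient_accumulation_steps=4,  # 64 * 4 = 256 total train batch size
    lr_scheduler_type="linear",
    warmup_ratio=0.2,
    num_train_epochs=100,
    label_smoothing_factor=0.1,
)
```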

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log        | 0.9231  | 3    | 0.4160          | 0.8791   |
| No log        | 1.8462  | 6    | 0.4668          | 0.8388   |
| No log        | 2.7692  | 9    | 0.5433          | 0.8022   |
| 0.3869        | 4.0     | 13   | 0.5052          | 0.8168   |
| 0.3869        | 4.9231  | 16   | 0.4591          | 0.8571   |
| 0.3869        | 5.8462  | 19   | 0.4820          | 0.8278   |
| 0.3658        | 6.7692  | 22   | 0.4953          | 0.8095   |
| 0.3658        | 8.0     | 26   | 0.4497          | 0.8608   |
| 0.3658        | 8.9231  | 29   | 0.4686          | 0.8315   |
| 0.3439        | 9.8462  | 32   | 0.4506          | 0.8608   |
| 0.3439        | 10.7692 | 35   | 0.4859          | 0.8168   |
| 0.3439        | 12.0    | 39   | 0.4929          | 0.8168   |
| 0.3416        | 12.9231 | 42   | 0.4957          | 0.8059   |
| 0.3416        | 13.8462 | 45   | 0.5229          | 0.7875   |
| 0.3416        | 14.7692 | 48   | 0.4473          | 0.8535   |
| 0.324         | 16.0    | 52   | 0.5260          | 0.8059   |
| 0.324         | 16.9231 | 55   | 0.4582          | 0.8462   |
| 0.324         | 17.8462 | 58   | 0.5299          | 0.7839   |
| 0.3273        | 18.7692 | 61   | 0.4947          | 0.8205   |
| 0.3273        | 20.0    | 65   | 0.5393          | 0.7692   |
| 0.3273        | 20.9231 | 68   | 0.4916          | 0.8278   |
| 0.3397        | 21.8462 | 71   | 0.5360          | 0.7802   |
| 0.3397        | 22.7692 | 74   | 0.5661          | 0.7656   |
| 0.3397        | 24.0    | 78   | 0.6354          | 0.7216   |
| 0.3344        | 24.9231 | 81   | 0.6782          | 0.7033   |
| 0.3344        | 25.8462 | 84   | 0.5704          | 0.7582   |
| 0.3344        | 26.7692 | 87   | 0.6537          | 0.6777   |
| 0.3325        | 28.0    | 91   | 0.4798          | 0.8425   |
| 0.3325        | 28.9231 | 94   | 0.5158          | 0.8059   |
| 0.3325        | 29.8462 | 97   | 0.5408          | 0.7912   |
| 0.3283        | 30.7692 | 100  | 0.5964          | 0.7399   |
| 0.3283        | 32.0    | 104  | 0.5069          | 0.8205   |
| 0.3283        | 32.9231 | 107  | 0.5396          | 0.7875   |
| 0.3229        | 33.8462 | 110  | 0.5203          | 0.7985   |
| 0.3229        | 34.7692 | 113  | 0.5464          | 0.7875   |
| 0.3229        | 36.0    | 117  | 0.5890          | 0.7509   |
| 0.3207        | 36.9231 | 120  | 0.5080          | 0.8132   |
| 0.3207        | 37.8462 | 123  | 0.4944          | 0.8168   |
| 0.3207        | 38.7692 | 126  | 0.4968          | 0.8095   |
| 0.3286        | 40.0    | 130  | 0.4874          | 0.8132   |
| 0.3286        | 40.9231 | 133  | 0.5013          | 0.8059   |
| 0.3286        | 41.8462 | 136  | 0.5329          | 0.7656   |
| 0.3286        | 42.7692 | 139  | 0.6199          | 0.6996   |
| 0.3154        | 44.0    | 143  | 0.4854          | 0.8059   |
| 0.3154        | 44.9231 | 146  | 0.5545          | 0.7509   |
| 0.3154        | 45.8462 | 149  | 0.5267          | 0.7729   |
| 0.3119        | 46.7692 | 152  | 0.5214          | 0.7802   |
| 0.3119        | 48.0    | 156  | 0.5265          | 0.7839   |
| 0.3119        | 48.9231 | 159  | 0.5137          | 0.7985   |
| 0.3036        | 49.8462 | 162  | 0.5354          | 0.7839   |
| 0.3036        | 50.7692 | 165  | 0.5269          | 0.7875   |
| 0.3036        | 52.0    | 169  | 0.5797          | 0.7399   |
| 0.2995        | 52.9231 | 172  | 0.6258          | 0.7179   |
| 0.2995        | 53.8462 | 175  | 0.5512          | 0.7692   |
| 0.2995        | 54.7692 | 178  | 0.5517          | 0.7619   |
| 0.306         | 56.0    | 182  | 0.5590          | 0.7546   |
| 0.306         | 56.9231 | 185  | 0.5514          | 0.7619   |
| 0.306         | 57.8462 | 188  | 0.5597          | 0.7509   |
| 0.2989        | 58.7692 | 191  | 0.5957          | 0.7326   |
| 0.2989        | 60.0    | 195  | 0.5366          | 0.7766   |
| 0.2989        | 60.9231 | 198  | 0.5465          | 0.7729   |
| 0.2931        | 61.8462 | 201  | 0.6171          | 0.7253   |
| 0.2931        | 62.7692 | 204  | 0.5768          | 0.7509   |
| 0.2931        | 64.0    | 208  | 0.5706          | 0.7509   |
| 0.299         | 64.9231 | 211  | 0.5962          | 0.7363   |
| 0.299         | 65.8462 | 214  | 0.6220          | 0.7216   |
| 0.299         | 66.7692 | 217  | 0.5929          | 0.7363   |
| 0.2969        | 68.0    | 221  | 0.6136          | 0.7253   |
| 0.2969        | 68.9231 | 224  | 0.6092          | 0.7289   |
| 0.2969        | 69.8462 | 227  | 0.6029          | 0.7253   |
| 0.3015        | 70.7692 | 230  | 0.5356          | 0.7766   |
| 0.3015        | 72.0    | 234  | 0.5376          | 0.7692   |
| 0.3015        | 72.9231 | 237  | 0.5886          | 0.7436   |
| 0.2919        | 73.8462 | 240  | 0.5869          | 0.7436   |
| 0.2919        | 74.7692 | 243  | 0.5846          | 0.7473   |
| 0.2919        | 76.0    | 247  | 0.5507          | 0.7656   |
| 0.288         | 76.9231 | 250  | 0.5801          | 0.7509   |
| 0.288         | 77.8462 | 253  | 0.6077          | 0.7399   |
| 0.288         | 78.7692 | 256  | 0.5848          | 0.7436   |
| 0.2951        | 80.0    | 260  | 0.5435          | 0.7692   |
| 0.2951        | 80.9231 | 263  | 0.5638          | 0.7656   |
| 0.2951        | 81.8462 | 266  | 0.5795          | 0.7399   |
| 0.2951        | 82.7692 | 269  | 0.5774          | 0.7509   |
| 0.2875        | 84.0    | 273  | 0.5703          | 0.7509   |
| 0.2875        | 84.9231 | 276  | 0.5713          | 0.7509   |
| 0.2875        | 85.8462 | 279  | 0.5784          | 0.7473   |
| 0.2855        | 86.7692 | 282  | 0.5904          | 0.7436   |
| 0.2855        | 88.0    | 286  | 0.5917          | 0.7326   |
| 0.2855        | 88.9231 | 289  | 0.5860          | 0.7473   |
| 0.2964        | 89.8462 | 292  | 0.5858          | 0.7473   |
| 0.2964        | 90.7692 | 295  | 0.5823          | 0.7436   |
| 0.2964        | 92.0    | 299  | 0.5817          | 0.7436   |
| 0.291         | 92.3077 | 300  | 0.5816          | 0.7436   |

### Framework versions

- Transformers 4.44.2
- Pytorch 2.4.1+cu121
- Datasets 3.2.0
- Tokenizers 0.19.1
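
A quick way to confirm a local environment matches these pins; a sketch, assuming all four packages are installed.

```python
# Sanity-check the local environment against the pinned versions above.
import datasets
import tokenizers
import torch
import transformers

assert transformers.__version__ == "4.44.2"
assert torch.__version__.startswith("2.4.1")  # full build: 2.4.1+cu121
assert datasets.__version__ == "3.2.0"
assert tokenizers.__version__ == "0.19.1"
```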