9_mae_1

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set (a loading sketch follows the list):

  • Loss: 0.5411
  • Accuracy: 0.9348
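
For reference, a minimal loading and inference sketch in Python. The checkpoint id `beingbatman/9_mae_1` is taken from this repository's name, and the 16-frame, 224x224 clip shape assumes the VideoMAE defaults; adjust both to match the actual training setup.

```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

ckpt = "beingbatman/9_mae_1"  # this repository; swap in a local path if needed

processor = VideoMAEImageProcessor.from_pretrained(ckpt)
model = VideoMAEForVideoClassification.from_pretrained(ckpt)
model.eval()

# Dummy clip: 16 RGB frames of 224x224 (VideoMAE's default input length);
# replace with real frames sampled from a video.
video = list(np.random.randint(0, 256, (16, 224, 224, 3), dtype=np.uint8))

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(model.config.id2label[predicted_class])
```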

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a configuration sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 9700
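
As a hedged sketch, the hyperparameters above map onto the Transformers `Trainer` API roughly as follows. Dataset preparation, the image processor, and the metric function are omitted, and the label count, evaluation/saving cadence, and best-checkpoint selection are assumptions not recorded in this card; this illustrates the configuration, it is not the original training script.

```python
from transformers import (
    Trainer,
    TrainingArguments,
    VideoMAEForVideoClassification,
)

num_labels = 2  # placeholder: the actual label count is not recorded in this card

model = VideoMAEForVideoClassification.from_pretrained(
    "MCG-NJU/videomae-large-finetuned-kinetics",
    num_labels=num_labels,
    ignore_mismatched_sizes=True,  # replace the 400-class Kinetics head with a fresh one
)

args = TrainingArguments(
    output_dir="9_mae_1",
    learning_rate=1e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    optim="adamw_torch",            # AdamW with betas=(0.9, 0.999), epsilon=1e-08 (defaults)
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=9700,
    eval_strategy="epoch",          # assumption: evaluation cadence is not recorded in this card
    save_strategy="epoch",
    load_best_model_at_end=True,    # assumption, consistent with the reported best-epoch metrics
    metric_for_best_model="accuracy",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=train_dataset,      # assumed to be prepared elsewhere
    eval_dataset=eval_dataset,        # assumed to be prepared elsewhere
    compute_metrics=compute_metrics,  # e.g. accuracy via the `evaluate` library
)
trainer.train()
```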

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.6763 | 0.0201 | 195 | 0.8186 | 0.4565 |
| 0.5598 | 1.0201 | 390 | 0.6849 | 0.4565 |
| 0.5409 | 2.0201 | 585 | 0.4701 | 0.7826 |
| 0.4927 | 3.0201 | 780 | 0.4634 | 0.7826 |
| 0.4862 | 4.0201 | 975 | 0.3988 | 0.8261 |
| 0.4784 | 5.0201 | 1170 | 0.6177 | 0.7391 |
| 0.3523 | 6.0201 | 1365 | 0.7854 | 0.6087 |
| 0.4808 | 7.0201 | 1560 | 1.5676 | 0.4783 |
| 0.9314 | 8.0201 | 1755 | 0.3879 | 0.8478 |
| 0.4025 | 9.0201 | 1950 | 0.4501 | 0.7826 |
| 0.47 | 10.0201 | 2145 | 0.4795 | 0.7826 |
| 0.3665 | 11.0201 | 2340 | 0.4920 | 0.8043 |
| 0.3445 | 12.0201 | 2535 | 0.5613 | 0.7391 |
| 0.3533 | 13.0201 | 2730 | 0.4344 | 0.8261 |
| 0.5705 | 14.0201 | 2925 | 1.2464 | 0.6087 |
| 0.3866 | 15.0201 | 3120 | 0.4108 | 0.8261 |
| 0.7164 | 16.0201 | 3315 | 0.3473 | 0.8696 |
| 0.9217 | 17.0201 | 3510 | 0.3302 | 0.8478 |
| 0.8893 | 18.0201 | 3705 | 0.5586 | 0.8043 |
| 0.4097 | 19.0201 | 3900 | 0.5201 | 0.7826 |
| 0.7003 | 20.0201 | 4095 | 0.6116 | 0.8478 |
| 0.4199 | 21.0201 | 4290 | 0.7902 | 0.7391 |
| 0.2288 | 22.0201 | 4485 | 0.7856 | 0.8261 |
| 0.205 | 23.0201 | 4680 | 0.5854 | 0.8478 |
| 0.312 | 24.0201 | 4875 | 0.6283 | 0.8696 |
| 0.4002 | 25.0201 | 5070 | 0.8241 | 0.8261 |
| 0.4637 | 26.0201 | 5265 | 1.0216 | 0.7826 |
| 0.1429 | 27.0201 | 5460 | 0.9300 | 0.8043 |
| 0.4351 | 28.0201 | 5655 | 1.0207 | 0.8043 |
| 0.2324 | 29.0201 | 5850 | 1.3275 | 0.7391 |
| 0.3545 | 30.0201 | 6045 | 1.1007 | 0.8261 |
| 0.1156 | 31.0201 | 6240 | 1.2325 | 0.8261 |
| 0.3738 | 32.0201 | 6435 | 1.0279 | 0.8261 |
| 0.2009 | 33.0201 | 6630 | 1.5226 | 0.7391 |
| 0.1003 | 34.0201 | 6825 | 0.6896 | 0.8478 |
| 0.2604 | 35.0201 | 7020 | 0.6089 | 0.8913 |
| 0.0003 | 36.0201 | 7215 | 0.7792 | 0.8696 |
| 0.1185 | 37.0201 | 7410 | 0.5411 | 0.9348 |
| 0.1388 | 38.0201 | 7605 | 0.5653 | 0.9130 |
| 0.713 | 39.0201 | 7800 | 0.5521 | 0.9348 |
| 0.3258 | 40.0201 | 7995 | 0.5768 | 0.9348 |
| 0.0009 | 41.0201 | 8190 | 0.7341 | 0.8696 |
| 0.0802 | 42.0201 | 8385 | 0.7582 | 0.8478 |
| 0.2027 | 43.0201 | 8580 | 1.2214 | 0.8261 |
| 0.1575 | 44.0201 | 8775 | 1.0244 | 0.8696 |
| 0.0298 | 45.0201 | 8970 | 0.9467 | 0.8696 |
| 0.0002 | 46.0201 | 9165 | 0.9423 | 0.8696 |
| 0.0785 | 47.0201 | 9360 | 0.9311 | 0.8478 |
| 0.0056 | 48.0201 | 9555 | 0.9431 | 0.8478 |
| 0.0178 | 49.0149 | 9700 | 0.9325 | 0.8478 |

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0