8_CTMAE_1

This model is a fine-tuned version of MCG-NJU/videomae-large-finetuned-kinetics on an unknown dataset. It achieves the following results on the evaluation set (a minimal inference sketch is given after the results):

  • Loss: 0.7375
  • Accuracy: 0.8696
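
Since the base checkpoint is a VideoMAE video-classification model, the fine-tuned weights should load with the standard transformers video-classification classes. The sketch below is a hedged example, not part of the original card: the repository id beingbatman/8_CTMAE_1 is taken from this page, the 16-frame 224×224 dummy clip assumes the default VideoMAE preprocessing, and the labels returned by id2label depend on how the checkpoint was saved.

```python
import numpy as np
import torch
from transformers import VideoMAEImageProcessor, VideoMAEForVideoClassification

# Repository id as listed on this card (assumption: the processor config ships with the repo).
model_id = "beingbatman/8_CTMAE_1"

processor = VideoMAEImageProcessor.from_pretrained(model_id)
model = VideoMAEForVideoClassification.from_pretrained(model_id)
model.eval()

# Dummy clip: 16 RGB frames of 224x224; replace with frames decoded from a real video.
video = [np.random.randint(0, 256, (224, 224, 3), dtype=np.uint8) for _ in range(16)]

inputs = processor(video, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

predicted_class = logits.argmax(-1).item()
print(predicted_class, model.config.id2label.get(predicted_class))
```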

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a rough TrainingArguments sketch follows the list):

  • learning_rate: 1e-05
  • train_batch_size: 1
  • eval_batch_size: 1
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.1
  • training_steps: 21900
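
For reproducibility, these values map onto a transformers TrainingArguments configuration roughly as sketched below. This is a hedged reconstruction rather than the original training script: output_dir is a placeholder, and the evaluation cadence (every 438 steps) is inferred from the results table.

```python
from transformers import TrainingArguments

# Hedged reconstruction of the hyperparameters listed above; anything not in that
# list (output_dir, eval cadence) is an assumption, not taken from the original run.
training_args = TrainingArguments(
    output_dir="8_CTMAE_1",          # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=1,
    per_device_eval_batch_size=1,
    seed=42,
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    max_steps=21900,
    eval_strategy="steps",           # inferred from the results table
    eval_steps=438,
)
```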

Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.5188        | 0.02  | 438   | 0.6840          | 0.4783   |
| 0.5068        | 1.02  | 876   | 2.8475          | 0.4565   |
| 1.8316        | 2.02  | 1314  | 1.5493          | 0.4565   |
| 0.5219        | 3.02  | 1752  | 2.8128          | 0.4565   |
| 1.1618        | 4.02  | 2190  | 1.0596          | 0.5435   |
| 1.5861        | 5.02  | 2628  | 1.9090          | 0.4565   |
| 0.7609        | 6.02  | 3066  | 2.0779          | 0.4565   |
| 0.3447        | 7.02  | 3504  | 1.0396          | 0.5435   |
| 1.1399        | 8.02  | 3942  | 1.7504          | 0.4565   |
| 1.6307        | 9.02  | 4380  | 1.5973          | 0.4565   |
| 0.0175        | 10.02 | 4818  | 2.4790          | 0.4565   |
| 0.2556        | 11.02 | 5256  | 1.4058          | 0.4565   |
| 0.3923        | 12.02 | 5694  | 0.7601          | 0.5435   |
| 1.1171        | 13.02 | 6132  | 1.1797          | 0.5435   |
| 0.351         | 14.02 | 6570  | 1.6103          | 0.4783   |
| 1.0928        | 15.02 | 7008  | 0.5065          | 0.7826   |
| 0.5417        | 16.02 | 7446  | 1.1772          | 0.6957   |
| 0.0031        | 17.02 | 7884  | 1.7810          | 0.6087   |
| 0.0262        | 18.02 | 8322  | 0.8326          | 0.7174   |
| 0.0069        | 19.02 | 8760  | 0.7989          | 0.7174   |
| 0.002         | 20.02 | 9198  | 1.3372          | 0.6957   |
| 0.0023        | 21.02 | 9636  | 1.0854          | 0.7391   |
| 1.0456        | 22.02 | 10074 | 1.8901          | 0.6739   |
| 1.2039        | 23.02 | 10512 | 1.5616          | 0.6957   |
| 0.5863        | 24.02 | 10950 | 2.0744          | 0.6087   |
| 0.0015        | 25.02 | 11388 | 1.6195          | 0.6957   |
| 0.5301        | 26.02 | 11826 | 1.3184          | 0.6957   |
| 0.0041        | 27.02 | 12264 | 2.4978          | 0.5652   |
| 1.1867        | 28.02 | 12702 | 0.7375          | 0.8696   |
| 1.0796        | 29.02 | 13140 | 1.2784          | 0.7391   |
| 0.4999        | 30.02 | 13578 | 0.6682          | 0.8696   |
| 0.0737        | 31.02 | 14016 | 0.8525          | 0.7826   |
| 0.2811        | 32.02 | 14454 | 0.8788          | 0.8261   |
| 0.197         | 33.02 | 14892 | 1.0332          | 0.7826   |
| 0.0005        | 34.02 | 15330 | 1.1382          | 0.8043   |
| 0.0006        | 35.02 | 15768 | 3.0701          | 0.5435   |
| 0.0007        | 36.02 | 16206 | 1.7615          | 0.6304   |
| 0.0006        | 37.02 | 16644 | 2.2577          | 0.6522   |
| 0.0004        | 38.02 | 17082 | 1.1589          | 0.7609   |
| 0.0006        | 39.02 | 17520 | 1.0367          | 0.8043   |
| 0.0005        | 40.02 | 17958 | 2.3663          | 0.6522   |
| 0.0003        | 41.02 | 18396 | 2.4580          | 0.6522   |
| 0.0005        | 42.02 | 18834 | 1.7281          | 0.6957   |
| 0.0           | 43.02 | 19272 | 1.2404          | 0.8043   |
| 0.424         | 44.02 | 19710 | 2.6335          | 0.6522   |
| 0.0002        | 45.02 | 20148 | 2.4237          | 0.6304   |
| 0.0002        | 46.02 | 20586 | 2.5241          | 0.6522   |
| 0.0001        | 47.02 | 21024 | 2.1765          | 0.6522   |
| 0.0001        | 48.02 | 21462 | 2.2171          | 0.6739   |
| 0.0001        | 49.02 | 21900 | 2.4228          | 0.6304   |

Framework versions

  • Transformers 4.46.2
  • Pytorch 2.0.1+cu117
  • Datasets 3.0.1
  • Tokenizers 0.20.0