---
library_name: transformers
license: cc-by-nc-4.0
base_model: MCG-NJU/videomae-large-finetuned-kinetics
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: CTMAE2_CS_V7_3
  results: []
---


# CTMAE2_CS_V7_3

This model is a fine-tuned version of [MCG-NJU/videomae-large-finetuned-kinetics](https://huggingface.co/MCG-NJU/videomae-large-finetuned-kinetics) on an unknown dataset.
It achieves the following results on the evaluation set (corresponding to step 8320 in the training results table below):
- Loss: 1.0930
- Accuracy: 0.8261
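As an illustrative sketch of how a VideoMAE classification checkpoint like this one can be run with `transformers`, the snippet below builds a tiny randomly-initialised model so it executes without downloading weights; the config values and the two-label head are assumptions for the demo, not details of this model. To use the actual checkpoint, load it with `VideoMAEForVideoClassification.from_pretrained(...)` instead.

```python
import torch
from transformers import VideoMAEConfig, VideoMAEForVideoClassification

# Tiny randomly-initialised VideoMAE for illustration only; in practice load
# the fine-tuned checkpoint, e.g.
#   model = VideoMAEForVideoClassification.from_pretrained("path/to/CTMAE2_CS_V7_3")
config = VideoMAEConfig(
    image_size=32, patch_size=8, num_frames=16, tubelet_size=2,
    hidden_size=64, num_hidden_layers=2, num_attention_heads=2,
    intermediate_size=128, num_labels=2,  # label count is a placeholder
)
model = VideoMAEForVideoClassification(config).eval()

# One clip of 16 RGB frames: (batch, frames, channels, height, width)
pixel_values = torch.randn(1, 16, 3, 32, 32)
with torch.no_grad():
    logits = model(pixel_values=pixel_values).logits
print(tuple(logits.shape))  # one score per label
```

With a real checkpoint, `model.config.id2label[logits.argmax(-1).item()]` maps the top logit back to a class name, and a `VideoMAEImageProcessor` should be used to resize and normalise raw frames before inference.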

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 1e-05
- train_batch_size: 3
- eval_batch_size: 3
- seed: 42
- optimizer: adamw_torch with betas=(0.9, 0.999), epsilon=1e-08, and no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- training_steps: 13000
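The warmup ratio and step count above imply 1,300 warmup steps (0.1 × 13,000). A minimal sketch of the resulting learning-rate curve, assuming the standard behaviour of the `linear` scheduler (linear warmup to the base rate, then linear decay to zero):

```python
def lr_at_step(step: int, base_lr: float = 1e-5,
               total_steps: int = 13_000, warmup_ratio: float = 0.1) -> float:
    """Linear warmup to base_lr, then linear decay to zero."""
    warmup_steps = int(total_steps * warmup_ratio)  # 1300 for these settings
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * (total_steps - step) / (total_steps - warmup_steps)

print(lr_at_step(1_300))   # peak learning rate: 1e-05
print(lr_at_step(13_000))  # end of training: 0.0
```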

### Training results

| Training Loss | Epoch | Step  | Validation Loss | Accuracy |
|:-------------:|:-----:|:-----:|:---------------:|:--------:|
| 0.6166        | 0.02  | 260   | 0.7472          | 0.4565   |
| 0.5113        | 1.02  | 520   | 0.9421          | 0.4565   |
| 0.4538        | 2.02  | 780   | 0.9290          | 0.4783   |
| 0.4927        | 3.02  | 1040  | 0.6694          | 0.5870   |
| 0.5214        | 4.02  | 1300  | 0.7649          | 0.7174   |
| 1.0967        | 5.02  | 1560  | 0.9229          | 0.6739   |
| 0.6237        | 6.02  | 1820  | 0.8940          | 0.7174   |
| 0.2162        | 7.02  | 2080  | 0.8480          | 0.7609   |
| 0.4234        | 8.02  | 2340  | 1.3532          | 0.6304   |
| 0.4629        | 9.02  | 2600  | 0.7409          | 0.7391   |
| 0.4631        | 10.02 | 2860  | 0.8471          | 0.7609   |
| 0.548         | 11.02 | 3120  | 0.9673          | 0.6957   |
| 0.355         | 12.02 | 3380  | 0.9122          | 0.7609   |
| 0.6545        | 13.02 | 3640  | 1.0228          | 0.7391   |
| 0.6703        | 14.02 | 3900  | 0.9249          | 0.7174   |
| 0.561         | 15.02 | 4160  | 1.1688          | 0.7174   |
| 0.3788        | 16.02 | 4420  | 1.9633          | 0.6522   |
| 0.3055        | 17.02 | 4680  | 1.1960          | 0.6957   |
| 0.2223        | 18.02 | 4940  | 1.0511          | 0.7609   |
| 0.4324        | 19.02 | 5200  | 1.5567          | 0.6957   |
| 0.3022        | 20.02 | 5460  | 1.6864          | 0.6304   |
| 0.4434        | 21.02 | 5720  | 1.7834          | 0.6304   |
| 0.2485        | 22.02 | 5980  | 1.4761          | 0.6739   |
| 0.3882        | 23.02 | 6240  | 1.8617          | 0.6522   |
| 0.0128        | 24.02 | 6500  | 1.6289          | 0.6739   |
| 0.4251        | 25.02 | 6760  | 1.5492          | 0.6957   |
| 0.0128        | 26.02 | 7020  | 2.4527          | 0.5652   |
| 0.2468        | 27.02 | 7280  | 1.8335          | 0.6739   |
| 0.1681        | 28.02 | 7540  | 1.0796          | 0.8043   |
| 0.4033        | 29.02 | 7800  | 2.3945          | 0.6087   |
| 0.1556        | 30.02 | 8060  | 1.7049          | 0.6739   |
| 0.3012        | 31.02 | 8320  | 1.0930          | 0.8261   |
| 0.0585        | 32.02 | 8580  | 1.5270          | 0.7174   |
| 0.0005        | 33.02 | 8840  | 1.1852          | 0.8261   |
| 0.0008        | 34.02 | 9100  | 1.6258          | 0.7609   |
| 0.117         | 35.02 | 9360  | 1.4406          | 0.7826   |
| 0.1401        | 36.02 | 9620  | 1.7366          | 0.7174   |
| 0.2498        | 37.02 | 9880  | 2.4993          | 0.6304   |
| 0.2411        | 38.02 | 10140 | 2.2741          | 0.6522   |
| 0.0004        | 39.02 | 10400 | 2.0468          | 0.6957   |
| 0.0001        | 40.02 | 10660 | 2.0636          | 0.6522   |
| 0.331         | 41.02 | 10920 | 2.1473          | 0.6522   |
| 0.0112        | 42.02 | 11180 | 1.8257          | 0.6957   |
| 0.2408        | 43.02 | 11440 | 2.2235          | 0.6522   |
| 0.0002        | 44.02 | 11700 | 2.2065          | 0.6739   |
| 0.0001        | 45.02 | 11960 | 2.4907          | 0.6522   |
| 0.0055        | 46.02 | 12220 | 2.2836          | 0.6304   |
| 0.0001        | 47.02 | 12480 | 2.6007          | 0.6304   |
| 0.0003        | 48.02 | 12740 | 2.2538          | 0.6304   |
| 0.0334        | 49.02 | 13000 | 2.3751          | 0.6304   |


### Framework versions

- Transformers 4.46.2
- Pytorch 2.0.1+cu117
- Datasets 3.0.1
- Tokenizers 0.20.0