---
library_name: transformers
license: mit
base_model: xlm-roberta-large
tags:
- generated_from_trainer
metrics:
- accuracy
model-index:
- name: emotion_model_improved
  results: []
---

# emotion_model_improved

This model is a fine-tuned version of [xlm-roberta-large](https://huggingface.co/xlm-roberta-large) on an unspecified multi-label emotion dataset (the Trainer did not record a dataset name).
It achieves the following results on the evaluation set:
- Loss: 0.2881
- Macro F1: 0.5947
- Micro F1: 0.6896
- Accuracy: 0.8522
- F1 Anger: 0.8051
- Precision Anger: 0.7756
- Recall Anger: 0.8368
- F1 Anticipation: 0.3591
- Precision Anticipation: 0.3484
- Recall Anticipation: 0.3705
- F1 Disgust: 0.7122
- Precision Disgust: 0.6203
- Recall Disgust: 0.8360
- F1 Fear: 0.7222
- Precision Fear: 0.6506
- Recall Fear: 0.8115
- F1 Joy: 0.8601
- Precision Joy: 0.8641
- Recall Joy: 0.8561
- F1 Sadness: 0.7075
- Precision Sadness: 0.6030
- Recall Sadness: 0.8558
- F1 Surprise: 0.2393
- Precision Surprise: 0.3305
- Recall Surprise: 0.1875
- F1 Trust: 0.2643
- Precision Trust: 0.2242
- Recall Trust: 0.3217
- F1 Love: 0.6566
- Precision Love: 0.7855
- Recall Love: 0.5640
- F1 Optimism: 0.7413
- Precision Optimism: 0.7730
- Recall Optimism: 0.7122
- F1 Pessimism: 0.4745
- Precision Pessimism: 0.3367
- Recall Pessimism: 0.8032
- Positive Predictions Pct: 25.8683
- Positive Labels Pct: 21.7367
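
The two `Pct` rows report the share of positive (example, label) decisions: the model predicts a label as present for about 25.9% of all example-label pairs, while about 21.7% are actually positive, so it slightly over-predicts overall.

For reference, the macro, micro, and per-label scores above follow the standard multi-label definitions: each of the 11 labels is treated as an independent binary problem, macro F1 averages the per-label F1 scores, and micro F1 pools all label decisions before computing F1. A minimal sketch with scikit-learn (the arrays here are illustrative, not the actual evaluation data):

```python
import numpy as np
from sklearn.metrics import f1_score

LABELS = ["anger", "anticipation", "disgust", "fear", "joy", "sadness",
          "surprise", "trust", "love", "optimism", "pessimism"]

# Illustrative ground truth and predictions: one row per example,
# one binary column per label (multi-label, so rows may have several 1s).
y_true = np.array([[1, 0, 1, 0, 0, 1, 0, 0, 0, 0, 1],
                   [0, 0, 0, 0, 1, 0, 0, 0, 1, 1, 0]])
y_pred = np.array([[1, 0, 1, 0, 0, 0, 0, 0, 0, 0, 1],
                   [0, 1, 0, 0, 1, 0, 0, 0, 1, 1, 0]])

# Macro F1 averages F1 over labels; micro F1 pools all label decisions.
macro_f1 = f1_score(y_true, y_pred, average="macro", zero_division=0)
micro_f1 = f1_score(y_true, y_pred, average="micro", zero_division=0)

# Per-label F1, matching the "F1 Anger" style rows above.
per_label_f1 = f1_score(y_true, y_pred, average=None, zero_division=0)
for name, score in zip(LABELS, per_label_f1):
    print(f"F1 {name}: {score:.4f}")
print(f"Macro F1: {macro_f1:.4f}  Micro F1: {micro_f1:.4f}")
```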

## Model description

This model is a multi-label emotion classifier built on XLM-RoBERTa-large. For each input text it independently scores eleven emotion labels: anger, anticipation, disgust, fear, joy, sadness, surprise, trust, love, optimism, and pessimism. Several labels can be active for the same text, so each label is a separate binary decision rather than a single-class choice.

## Intended uses & limitations

The model is intended for multi-label emotion classification of short texts. The evaluation results suggest it is dependable on the frequent, well-separated emotions (joy, anger, optimism, fear, disgust, and sadness all reach F1 above 0.70) but markedly weaker on anticipation (F1 0.36), trust (F1 0.26), and surprise (F1 0.24); predictions for those three labels should be treated with caution. Because the training data is undocumented, domain and language coverage are unknown, although the XLM-RoBERTa base model is multilingual.
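
A minimal inference sketch, assuming the checkpoint is available locally or on the Hub under the name `emotion_model_improved` (a placeholder) and that a 0.5 sigmoid threshold applies; the card does not record the threshold actually used:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Hypothetical checkpoint location; replace with the actual path or repo id.
model_id = "emotion_model_improved"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

text = "I can't believe we finally won -- what a day!"
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label head: apply a sigmoid per label and threshold independently.
probs = torch.sigmoid(logits)[0]
threshold = 0.5  # assumed; could be tuned per label given the recall/precision gaps above
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > threshold]
print(predicted)
```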

## Training and evaluation data

The training and evaluation datasets were not recorded by the Trainer. From the label set and the positive-label rate (about 21.7% of example-label pairs, i.e. roughly 2.4 of the 11 labels per example on average), the data is a multi-label emotion corpus; its source, size, and languages are otherwise unknown.

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 8e-06
- train_batch_size: 32
- eval_batch_size: 64
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: cosine
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 15
- mixed_precision_training: Native AMP
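
These settings correspond roughly to the following `TrainingArguments` (a reproducibility sketch; the dataset loading, model wiring, and metric function are omitted, and the exact Trainer setup is an assumption):

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="emotion_model_improved",
    learning_rate=8e-6,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=64,
    gradient_accumulation_steps=4,   # effective train batch size: 32 * 4 = 128
    num_train_epochs=15,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    seed=42,
    fp16=True,                       # "Native AMP" mixed precision
    optim="adamw_torch",             # AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
)
```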

### Training results

| Training Loss | Epoch | Step | Validation Loss | Macro F1 | Micro F1 | Accuracy | F1 Anger | Precision Anger | Recall Anger | F1 Anticipation | Precision Anticipation | Recall Anticipation | F1 Disgust | Precision Disgust | Recall Disgust | F1 Fear | Precision Fear | Recall Fear | F1 Joy | Precision Joy | Recall Joy | F1 Sadness | Precision Sadness | Recall Sadness | F1 Surprise | Precision Surprise | Recall Surprise | F1 Trust | Precision Trust | Recall Trust | F1 Love | Precision Love | Recall Love | F1 Optimism | Precision Optimism | Recall Optimism | F1 Pessimism | Precision Pessimism | Recall Pessimism | Positive Predictions Pct | Positive Labels Pct |
|:-------------:|:-----:|:----:|:---------------:|:--------:|:--------:|:--------:|:--------:|:---------------:|:------------:|:---------------:|:----------------------:|:-------------------:|:----------:|:-----------------:|:--------------:|:-------:|:--------------:|:-----------:|:------:|:-------------:|:----------:|:----------:|:-----------------:|:--------------:|:-----------:|:------------------:|:---------------:|:--------:|:---------------:|:------------:|:-------:|:--------------:|:-----------:|:-----------:|:------------------:|:---------------:|:------------:|:-------------------:|:----------------:|:------------------------:|:-------------------:|
| 0.6834        | 1.0   | 72   | 0.4816          | 0.2295   | 0.4570   | 0.6345   | 0.5297   | 0.3603          | 1.0          | 0.0             | 0.0                    | 0.0                 | 0.4483     | 0.2889            | 1.0            | 0.0     | 0.0            | 0.0         | 0.5649 | 0.3936        | 1.0        | 0.4936     | 0.3277            | 1.0            | 0.0         | 0.0                | 0.0             | 0.0      | 0.0             | 0.0          | 0.0     | 0.0            | 0.0         | 0.4884      | 0.3238             | 0.9937          | 0.0          | 0.0                 | 0.0              | 45.3927                  | 21.9208             |
| 0.4738        | 2.0   | 144  | 0.3507          | 0.4607   | 0.6320   | 0.7951   | 0.7069   | 0.5593          | 0.9604       | 0.0             | 0.0                    | 0.0                 | 0.6359     | 0.4807            | 0.9388         | 0.4906  | 0.3694         | 0.7302      | 0.8185 | 0.7592        | 0.8877     | 0.6246     | 0.4692            | 0.9336         | 0.0         | 0.0                | 0.0             | 0.0      | 0.0             | 0.0          | 0.6323  | 0.5235         | 0.7980      | 0.7368      | 0.6486             | 0.8529          | 0.4223       | 0.2951              | 0.7422           | 33.7556                  | 21.9208             |
| 0.3445        | 3.0   | 216  | 0.3131          | 0.5845   | 0.6929   | 0.8585   | 0.7807   | 0.8             | 0.7623       | 0.3463          | 0.4804                 | 0.2707              | 0.7273     | 0.6953            | 0.7624         | 0.7003  | 0.7637         | 0.6465      | 0.8477 | 0.8140        | 0.8843     | 0.7385     | 0.6994            | 0.7822         | 0.1839      | 0.2051             | 0.1667          | 0.2020   | 0.1360          | 0.3924       | 0.6915  | 0.6374         | 0.7557      | 0.7637      | 0.7268             | 0.8046          | 0.4473       | 0.3785              | 0.5467           | 24.1394                  | 21.9208             |
| 0.3076        | 4.0   | 288  | 0.3035          | 0.5792   | 0.6867   | 0.8561   | 0.7741   | 0.7691          | 0.7792       | 0.3486          | 0.3904                 | 0.3149              | 0.7255     | 0.6826            | 0.7741         | 0.6789  | 0.7818         | 0.6         | 0.8348 | 0.7760        | 0.9033     | 0.7201     | 0.7044            | 0.7365         | 0.2316      | 0.2340             | 0.2292          | 0.1949   | 0.1364          | 0.3418       | 0.6912  | 0.6792         | 0.7036      | 0.7587      | 0.7018             | 0.8256          | 0.4125       | 0.3882              | 0.44             | 24.0158                  | 21.9208             |
| 0.2836        | 5.0   | 360  | 0.2969          | 0.6002   | 0.7045   | 0.8648   | 0.7859   | 0.7927          | 0.7792       | 0.3462          | 0.4122                 | 0.2983              | 0.7387     | 0.6950            | 0.7882         | 0.7364  | 0.6926         | 0.7860      | 0.8543 | 0.8160        | 0.8964     | 0.7339     | 0.7137            | 0.7552         | 0.2735      | 0.2319             | 0.3333          | 0.2190   | 0.1756          | 0.2911       | 0.6983  | 0.6779         | 0.7199      | 0.7653      | 0.7440             | 0.7878          | 0.4508       | 0.3927              | 0.5289           | 23.8366                  | 21.9208             |
| 0.27          | 6.0   | 432  | 0.2930          | 0.6238   | 0.6993   | 0.8541   | 0.8007   | 0.7733          | 0.8302       | 0.3858          | 0.4167                 | 0.3591              | 0.7349     | 0.6604            | 0.8282         | 0.7578  | 0.7316         | 0.7860      | 0.8506 | 0.8412        | 0.8601     | 0.7366     | 0.7045            | 0.7718         | 0.4051      | 0.5161             | 0.3333          | 0.2334   | 0.1477          | 0.5570       | 0.7170  | 0.6466         | 0.8046      | 0.7745      | 0.7596             | 0.7899          | 0.4650       | 0.3395              | 0.7378           | 26.5991                  | 21.9208             |
| 0.2587        | 7.0   | 504  | 0.2888          | 0.6137   | 0.6969   | 0.8525   | 0.7948   | 0.7756          | 0.8151       | 0.3526          | 0.3697                 | 0.3370              | 0.7387     | 0.6667            | 0.8282         | 0.7348  | 0.7704         | 0.7023      | 0.8528 | 0.8406        | 0.8653     | 0.7384     | 0.6776            | 0.8112         | 0.3505      | 0.3469             | 0.3542          | 0.2185   | 0.1403          | 0.4937       | 0.7166  | 0.6524         | 0.7948      | 0.7703      | 0.7370             | 0.8067          | 0.4831       | 0.3532              | 0.7644           | 26.7474                  | 21.9208             |
| 0.248         | 8.0   | 576  | 0.2865          | 0.6177   | 0.6960   | 0.8520   | 0.7923   | 0.7691          | 0.8170       | 0.3802          | 0.3596                 | 0.4033              | 0.7329     | 0.6712            | 0.8071         | 0.7379  | 0.7716         | 0.7070      | 0.8560 | 0.8219        | 0.8929     | 0.7317     | 0.7096            | 0.7552         | 0.3738      | 0.3390             | 0.4167          | 0.2259   | 0.1444          | 0.5190       | 0.7233  | 0.6486         | 0.8176      | 0.7671      | 0.7261             | 0.8130          | 0.4734       | 0.3548              | 0.7111           | 26.7598                  | 21.9208             |
| 0.2404        | 9.0   | 648  | 0.2865          | 0.6219   | 0.7087   | 0.8617   | 0.7959   | 0.7900          | 0.8019       | 0.3913          | 0.3850                 | 0.3978              | 0.7417     | 0.6811            | 0.8141         | 0.7489  | 0.6902         | 0.8186      | 0.8579 | 0.8361        | 0.8808     | 0.7390     | 0.6831            | 0.8050         | 0.3542      | 0.3542             | 0.3542          | 0.2368   | 0.1812          | 0.3418       | 0.7254  | 0.7721         | 0.6840      | 0.7747      | 0.7525             | 0.7983          | 0.475        | 0.3455              | 0.76             | 25.5547                  | 21.9208             |


### Framework versions

- Transformers 4.48.2
- PyTorch 2.3.1.post300
- Datasets 2.2.1
- Tokenizers 0.21.0