---
library_name: transformers
license: apache-2.0
base_model: openai/whisper-large-v2
tags:
- generated_from_trainer
datasets:
- generator
model-index:
- name: whisper-large-v2-multilingual
  results: []
---

# whisper-large-v2-multilingual

This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) on the generator dataset, i.e. data streamed to the Trainer through a Python generator (see Training and evaluation data below).
It achieves the following results on the evaluation set (per-language WER and CER; see the note after the list):
- Loss: 0.3502
- Wer Eng: 0.01
- Wer Lug: 0.106
- Wer Ach: 0.241
- Wer Lgg: 0.361
- Wer Teo: 0.327
- Wer Nyn: 0.387
- Wer Mean: 0.239
- Cer Eng: 0.004
- Cer Lug: 0.029
- Cer Ach: 0.064
- Cer Lgg: 0.118
- Cer Teo: 0.145
- Cer Nyn: 0.125
- Cer Mean: 0.081
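
The `Wer`/`Cer` prefixes are word and character error rates, the suffixes are ISO 639-3 language codes (eng = English, lug = Luganda, ach = Acholi, lgg = Lugbara, teo = Ateso, nyn = Nyankore/Runyankole), and the `Mean` entries are unweighted averages over the six languages. As a reference point, here is a minimal sketch of how such per-language scores can be computed with the `evaluate` library; the transcripts are hypothetical placeholders, not the actual evaluation data:

```python
# Sketch: per-language WER/CER with the `evaluate` library.
import evaluate

wer_metric = evaluate.load("wer")
cer_metric = evaluate.load("cer")

# Hypothetical (predictions, references) pairs keyed by ISO 639-3 code.
samples = {
    "eng": (["hello world"], ["hello world"]),
    "lug": (["webale nyo"], ["webale nnyo"]),
}

scores = {}
for lang, (preds, refs) in samples.items():
    scores[f"wer_{lang}"] = wer_metric.compute(predictions=preds, references=refs)
    scores[f"cer_{lang}"] = cer_metric.compute(predictions=preds, references=refs)

# The card's "Mean" rows are plain averages over the per-language scores.
scores["wer_mean"] = sum(scores[f"wer_{lang}"] for lang in samples) / len(samples)
print(scores)
```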

## Model description

Whisper large-v2 fine-tuned for multilingual automatic speech recognition. Judging from the evaluation metrics, the checkpoint covers English and five Ugandan languages: Luganda, Acholi, Lugbara, Ateso, and Nyankore (Runyankole).

## Intended uses & limitations

Intended for transcribing speech in the languages listed above. Accuracy varies widely by language on the evaluation set, from a WER of 0.01 for English to 0.387 for Nyankore, so performance should be validated on target-domain data, especially for the lower-resource languages.
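
A minimal inference sketch with the `transformers` pipeline API; the repo id and audio file are placeholders, since the published location of the checkpoint is not given here:

```python
# Sketch: transcription with the transformers ASR pipeline.
# "your-org/whisper-large-v2-multilingual" and "sample.wav" are placeholders.
import torch
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-org/whisper-large-v2-multilingual",  # placeholder repo id
    device=0 if torch.cuda.is_available() else -1,
)

# Whisper operates on 30-second windows; chunking handles longer audio.
result = asr("sample.wav", chunk_length_s=30)
print(result["text"])
```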

## Training and evaluation data

The training and evaluation data are not documented beyond the dataset name `generator`, which indicates the examples were supplied to the Trainer through a Python generator (e.g. via `datasets.Dataset.from_generator`) rather than loaded from a named Hub dataset.
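
A sketch of that pattern with `datasets.Dataset.from_generator`; the file paths and transcripts are hypothetical:

```python
# Sketch: building the training set from a Python generator, which is
# what yields the dataset name "generator" in an auto-generated card.
# File paths and transcripts are hypothetical.
from datasets import Audio, Dataset

def data_generator():
    # In practice this would iterate over real audio/transcript pairs.
    yield {"audio": "clips/utt_0001.wav", "sentence": "webale nnyo"}
    yield {"audio": "clips/utt_0002.wav", "sentence": "hello world"}

ds = Dataset.from_generator(data_generator)
# Decode audio lazily and resample to the 16 kHz Whisper expects.
ds = ds.cast_column("audio", Audio(sampling_rate=16_000))
```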

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `Seq2SeqTrainingArguments` sketch follows the list):
- learning_rate: 1e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 32
- total_train_batch_size: 256
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 50
- training_steps: 2000
- mixed_precision_training: Native AMP
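
These settings give an effective batch size of 256 (8 per device × 32 accumulation steps, so a single device). Below is a sketch of the equivalent `Seq2SeqTrainingArguments`, reconstructed from the list above rather than taken from the actual training script; the output directory is a placeholder:

```python
# Sketch: the hyperparameters above as Seq2SeqTrainingArguments.
# Adam betas (0.9, 0.999) and epsilon 1e-08 match the defaults.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v2-multilingual",  # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=32,  # 8 x 32 = 256 effective batch size
    lr_scheduler_type="linear",
    warmup_steps=50,
    max_steps=2000,
    fp16=True,  # "Native AMP" mixed precision
)
```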

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Wer Eng | Wer Lug | Wer Ach | Wer Lgg | Wer Teo | Wer Nyn | Wer Mean | Cer Eng | Cer Lug | Cer Ach | Cer Lgg | Cer Teo | Cer Nyn | Cer Mean |
|:-------------:|:------:|:----:|:---------------:|:-------:|:-------:|:-------:|:-------:|:-------:|:-------:|:--------:|:-------:|:-------:|:-------:|:-------:|:-------:|:-------:|:--------:|
| 1.0754        | 0.1    | 200  | 0.6042          | 0.031   | 0.267   | 0.428   | 0.481   | 0.615   | 0.712   | 0.422    | 0.017   | 0.053   | 0.121   | 0.147   | 0.242   | 0.201   | 0.13     |
| 0.5269        | 1.0001 | 400  | 0.4731          | 0.018   | 0.176   | 0.34    | 0.422   | 0.45    | 0.531   | 0.323    | 0.008   | 0.04    | 0.093   | 0.126   | 0.174   | 0.156   | 0.099    |
| 0.4203        | 1.1001 | 600  | 0.4230          | 0.018   | 0.199   | 0.319   | 0.416   | 0.418   | 0.482   | 0.309    | 0.007   | 0.06    | 0.094   | 0.122   | 0.163   | 0.146   | 0.099    |
| 0.3851        | 2.0002 | 800  | 0.3871          | 0.014   | 0.116   | 0.271   | 0.374   | 0.443   | 0.454   | 0.279    | 0.006   | 0.03    | 0.073   | 0.11    | 0.19    | 0.137   | 0.091    |
| 0.3241        | 2.1002 | 1000 | 0.3716          | 0.015   | 0.12    | 0.271   | 0.392   | 0.397   | 0.416   | 0.269    | 0.006   | 0.03    | 0.077   | 0.133   | 0.166   | 0.127   | 0.09     |
| 0.3161        | 3.0004 | 1200 | 0.3584          | 0.021   | 0.121   | 0.269   | 0.429   | 0.36    | 0.38    | 0.263    | 0.008   | 0.032   | 0.071   | 0.154   | 0.16    | 0.121   | 0.091    |
| 0.2764        | 3.1004 | 1400 | 0.3546          | 0.012   | 0.116   | 0.254   | 0.376   | 0.348   | 0.403   | 0.251    | 0.004   | 0.03    | 0.073   | 0.123   | 0.155   | 0.125   | 0.085    |
| 0.2692        | 4.0005 | 1600 | 0.3487          | 0.011   | 0.107   | 0.248   | 0.352   | 0.336   | 0.377   | 0.238    | 0.004   | 0.029   | 0.067   | 0.102   | 0.15    | 0.123   | 0.079    |
| 0.2427        | 4.1005 | 1800 | 0.3535          | 0.01    | 0.113   | 0.24    | 0.384   | 0.329   | 0.387   | 0.244    | 0.004   | 0.03    | 0.066   | 0.122   | 0.145   | 0.125   | 0.082    |
| 0.2413        | 5.0006 | 2000 | 0.3502          | 0.01    | 0.106   | 0.241   | 0.361   | 0.327   | 0.387   | 0.239    | 0.004   | 0.029   | 0.064   | 0.118   | 0.145   | 0.125   | 0.081    |


### Framework versions

- Transformers 4.44.2
- Pytorch 2.2.1
- Datasets 2.21.0
- Tokenizers 0.19.1