---
language:
- nl
license: apache-2.0
base_model: openai/whisper-large-v2
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: Whisper Large V2
  results: []
---

# Whisper Large V2

This model is a fine-tuned version of [openai/whisper-large-v2](https://huggingface.co/openai/whisper-large-v2) on an unknown dataset.
It achieves the following results on the evaluation set:
- Loss: 0.2872
- Wer: 10.3543
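
As a quick usage sketch (not part of the original card), the checkpoint can be loaded for Dutch speech recognition with the 🤗 Transformers pipeline; the repository id and audio file below are placeholders:

```python
# Minimal sketch: transcribe Dutch speech with this checkpoint via the ASR pipeline.
# The repository id and audio path are placeholders, not values from this card.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="your-username/whisper-large-v2-nl",  # placeholder repo id
)

result = asr(
    "sample_dutch.wav",  # placeholder audio file (Whisper expects 16 kHz input)
    generate_kwargs={"language": "dutch", "task": "transcribe"},
)
print(result["text"])
```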

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 12
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 5
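
As a rough illustration only, these hyperparameters correspond to a `Seq2SeqTrainingArguments` configuration along the lines of the sketch below; the output directory, precision, and evaluation cadence are assumptions not stated in the card:

```python
# Sketch of Seq2SeqTrainingArguments matching the hyperparameters listed above.
# Values not stated in the card (output_dir, fp16, eval cadence) are assumptions.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="./whisper-large-v2-nl",  # assumed
    learning_rate=3e-5,
    per_device_train_batch_size=12,
    per_device_eval_batch_size=8,
    seed=42,
    optim="adamw_torch",                 # default Adam(W) with betas=(0.9, 0.999), eps=1e-08
    lr_scheduler_type="linear",
    warmup_steps=20,
    num_train_epochs=5,
    evaluation_strategy="steps",         # assumed; the table below reports eval every 30 steps
    eval_steps=30,
    fp16=True,                           # assumed
    predict_with_generate=True,          # assumed; needed to compute WER during evaluation
)
```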

### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer     |
|:-------------:|:-----:|:----:|:---------------:|:-------:|
0.6091 | 0.09 | 30 | 0.3548 | 15.0266 |
0.3073 | 0.19 | 60 | 0.3203 | 13.7016 |
0.3171 | 0.28 | 90 | 0.3049 | 12.6189 |
0.29 | 0.38 | 120 | 0.3033 | 13.9760 |
0.2907 | 0.47 | 150 | 0.2824 | 12.9750 |
0.2748 | 0.57 | 180 | 0.2737 | 13.1413 |
0.2637 | 0.66 | 210 | 0.2655 | 15.0149 |
0.2672 | 0.76 | 240 | 0.2629 | 15.7094 |
0.2483 | 0.85 | 270 | 0.2616 | 13.7483 |
0.2531 | 0.95 | 300 | 0.2603 | 13.5732 |
0.1988 | 1.04 | 330 | 0.2713 | 12.3417 |
0.1271 | 1.14 | 360 | 0.2644 | 12.3942 |
0.1309 | 1.23 | 390 | 0.2612 | 12.6218 |
0.1506 | 1.33 | 420 | 0.2633 | 17.3204 |
0.1365 | 1.42 | 450 | 0.2621 | 13.2551 |
0.1379 | 1.52 | 480 | 0.2636 | 13.2901 |
0.1325 | 1.61 | 510 | 0.2550 | 12.8845 |
0.129 | 1.71 | 540 | 0.2575 | 14.0139 |
0.1334 | 1.8 | 570 | 0.2513 | 12.2104 |
0.1418 | 1.9 | 600 | 0.2484 | 12.2541 |
0.1438 | 1.99 | 630 | 0.2457 | 12.0119 |
0.0651 | 2.09 | 660 | 0.2646 | 12.3358 |
0.0649 | 2.18 | 690 | 0.2684 | 10.6286 |
0.0638 | 2.28 | 720 | 0.2645 | 11.6121 |
0.0651 | 2.37 | 750 | 0.2616 | 11.4020 |
0.0656 | 2.47 | 780 | 0.2574 | 11.4457 |
0.0643 | 2.56 | 810 | 0.2592 | 11.7113 |
0.0682 | 2.66 | 840 | 0.2597 | 11.5625 |
0.0583 | 2.75 | 870 | 0.2571 | 12.9020 |
0.0608 | 2.85 | 900 | 0.2574 | 14.3991 |
0.064 | 2.94 | 930 | 0.2535 | 10.6023 |
0.0429 | 3.04 | 960 | 0.2648 | 10.9788 |
0.0264 | 3.13 | 990 | 0.2710 | 10.3514 |
0.0251 | 3.23 | 1020 | 0.2688 | 10.4302 |
0.0244 | 3.32 | 1050 | 0.2709 | 9.9778 |
0.0251 | 3.42 | 1080 | 0.2732 | 10.1733 |
0.0245 | 3.51 | 1110 | 0.2720 | 11.1043 |
0.0246 | 3.61 | 1140 | 0.2765 | 10.8446 |
0.0254 | 3.7 | 1170 | 0.2709 | 10.7658 |
0.0234 | 3.8 | 1200 | 0.2663 | 10.3485 |
0.022 | 3.89 | 1230 | 0.2649 | 11.4370 |
0.0237 | 3.99 | 1260 | 0.2688 | 11.0138 |
0.011 | 4.08 | 1290 | 0.2791 | 10.3076 |
0.0107 | 4.18 | 1320 | 0.2839 | 10.4798 |
0.0087 | 4.27 | 1350 | 0.2871 | 10.4856 |
0.0081 | 4.37 | 1380 | 0.2894 | 10.3280 |
0.0094 | 4.46 | 1410 | 0.2872 | 10.2259 |
0.0083 | 4.56 | 1440 | 0.2887 | 10.2288 |
0.0104 | 4.65 | 1470 | 0.2856 | 10.2638 |
0.009 | 4.75 | 1500 | 0.2855 | 10.3339 |
0.0068 | 4.84 | 1530 | 0.2865 | 10.4010 |
0.0082 | 4.94 | 1560 | 0.2872 | 10.3543 |
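
The `Wer` column above is word error rate, which here appears to be on a 0–100 scale. For reference, a minimal sketch of computing the same metric with the 🤗 `evaluate` library, using made-up example strings:

```python
# Minimal sketch: word error rate (WER) as reported in the table above.
# The reference/prediction strings are made-up examples, not from the training data.
import evaluate

wer_metric = evaluate.load("wer")

references = ["dit is een voorbeeldzin"]
predictions = ["dit is een voorbeeld zin"]

# `evaluate`'s WER is a fraction; multiply by 100 to match the scale used above.
wer = 100 * wer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.4f}")
```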

### Framework versions

- Transformers 4.38.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.17.0
- Tokenizers 0.15.0