devkyle committed (verified)
Commit: 99d82ca
Parent: be948dc

End of training

Files changed (1): README.md (+20 -1)
README.md CHANGED
@@ -4,6 +4,8 @@ license: apache-2.0
 base_model: openai/whisper-small
 tags:
 - generated_from_trainer
+metrics:
+- wer
 model-index:
 - name: whisper-small-akan
   results: []
@@ -15,6 +17,9 @@ should probably proofread and complete it, then remove this comment. -->
 # whisper-small-akan
 
 This model is a fine-tuned version of [openai/whisper-small](https://huggingface.co/openai/whisper-small) on the None dataset.
+It achieves the following results on the evaluation set:
+- Loss: 0.2208
+- Wer: 9.3196
 
 ## Model description
 
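For reviewers, a minimal usage sketch of the checkpoint this card describes; the Hub repo id `devkyle/whisper-small-akan` and the audio path are assumptions inferred from the committer and model name, not stated in the card.

```python
# Minimal sketch (not from the model card): transcribe a local audio file with
# the fine-tuned checkpoint. "devkyle/whisper-small-akan" is an assumed Hub
# repo id and "sample.wav" is a placeholder path.
from transformers import pipeline

asr = pipeline(
    "automatic-speech-recognition",
    model="devkyle/whisper-small-akan",  # assumed repo id
)

result = asr("sample.wav")  # mono recording; resampled to 16 kHz internally
print(result["text"])
```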
 
@@ -40,9 +45,23 @@ The following hyperparameters were used during training:
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
 - lr_scheduler_warmup_steps: 200
-- training_steps: 1000
+- training_steps: 2000
 - mixed_precision_training: Native AMP
 
+### Training results
+
+| Training Loss | Epoch | Step | Validation Loss | Wer     |
+|:-------------:|:-----:|:----:|:---------------:|:-------:|
+| 0.0726        | 10.0  | 250  | 0.8217          | 48.0395 |
+| 0.0246        | 20.0  | 500  | 0.9650          | 44.0903 |
+| 0.011         | 30.0  | 750  | 0.9165          | 40.6770 |
+| 0.0022        | 40.0  | 1000 | 0.9419          | 39.9436 |
+| 0.0009        | 50.0  | 1250 | 0.2120          | 9.7770  |
+| 0.0002        | 60.0  | 1500 | 0.2179          | 9.0909  |
+| 0.0001        | 70.0  | 1750 | 0.2201          | 9.3196  |
+| 0.0001        | 80.0  | 2000 | 0.2208          | 9.3196  |
+
+
 ### Framework versions
 
 - Transformers 4.44.2
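The hyperparameter list in the hunk above maps naturally onto `Seq2SeqTrainingArguments`; the sketch below covers only the values visible in this diff (learning rate and batch sizes sit outside the shown context and are omitted), with `output_dir` as a placeholder.

```python
# Sketch of the training configuration implied by the listed hyperparameters,
# assuming the standard transformers Seq2SeqTrainer setup. Only values visible
# in this diff are filled in; output_dir is a placeholder.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-small-akan",  # placeholder
    lr_scheduler_type="linear",
    warmup_steps=200,
    max_steps=2000,        # raised from 1000 in this commit
    fp16=True,             # "Native AMP" mixed-precision training
    adam_beta1=0.9,        # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)
```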
 
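The Wer column appears to be a percentage (100 * word error rate); a minimal sketch of computing it with the `evaluate` library, using made-up placeholder transcripts rather than the card's evaluation data:

```python
# Sketch of how WER figures like those in the table are typically computed with
# the `evaluate` library; the predictions and references below are hypothetical
# placeholders, not the card's evaluation data.
import evaluate

wer_metric = evaluate.load("wer")

predictions = ["me din de kofi"]        # hypothetical model transcript
references = ["me din de kofi mensah"]  # hypothetical reference transcript

wer = 100 * wer_metric.compute(predictions=predictions, references=references)
print(f"WER: {wer:.4f}%")  # the card's Wer values (e.g. 9.3196) look like percentages
```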