Huge thanks to [Johnathan Duering](https://github.com/duerig) for his help.
**This is highly experimental; I have not run a full training session. I have only verified that the loss goes down and that the eval samples sound reasonable after ~10K steps of minimal training.**
____________________________________________________________________________________
**NOTE**: I have uploaded two checkpoints so far. One is a 24 kHz HiFormer checkpoint, trained for roughly ~117K steps on LibriTTS (360 + 100) plus 40 hours of other English datasets.
The other checkpoint is HiFTNet at 44.1 kHz, trained on more than 1,100 hours of multilingual data that I sourced privately; it includes Arabic, Persian, Japanese, English, and Russian. This one is trained for ~100K steps.
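Since the two checkpoints expect different sample rates (24 kHz vs. 44.1 kHz), input audio has to be resampled to match the checkpoint you load. A minimal NumPy sketch of the idea using linear interpolation — a real pipeline should use a proper resampler (e.g. torchaudio or librosa) with anti-aliasing; this is only an illustration:

```python
import numpy as np

def resample_linear(audio: np.ndarray, orig_sr: int, target_sr: int) -> np.ndarray:
    """Naive linear-interpolation resampling.

    Fine as a sanity-check sketch, not for production audio:
    there is no anti-aliasing filter.
    """
    if orig_sr == target_sr:
        return audio
    duration = len(audio) / orig_sr
    n_target = int(round(duration * target_sr))
    old_t = np.arange(len(audio)) / orig_sr   # original sample times (s)
    new_t = np.arange(n_target) / target_sr   # target sample times (s)
    return np.interp(new_t, old_t, audio)

# Example: 1 second of 24 kHz audio resampled to 44.1 kHz
x = np.zeros(24000, dtype=np.float32)
y = resample_linear(x, 24000, 44100)
print(len(y))  # 44100 samples, i.e. 1 second at the target rate
```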
Ideally, both should be trained for up to 1M steps, so I strongly recommend further fine-tuning on your own downstream task until I pre-train these for more steps.
## Pre-requisites
1. Python >= 3.10