---
license: mit
base_model: microsoft/speecht5_tts
tags:
  - generated_from_trainer
model-index:
  - name: speecht5_tts
    results: []
---

# speecht5_tts

This model is a fine-tuned version of [microsoft/speecht5_tts](https://huggingface.co/microsoft/speecht5_tts) on an unspecified dataset (the training script did not record a dataset name). It achieves the following results on the evaluation set:

- Loss: 0.5139
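
Since the card does not yet document usage, here is a minimal inference sketch based on the standard `transformers` SpeechT5 API. The repository id `JBZhang2342/speecht5_tts` is inferred from this repo, and the zero speaker embedding is a placeholder assumption; substitute a real 512-dim x-vector for natural-sounding output.

```python
# Minimal sketch, assuming this checkpoint lives at "JBZhang2342/speecht5_tts".
import torch
import soundfile as sf
from transformers import SpeechT5Processor, SpeechT5ForTextToSpeech, SpeechT5HifiGan

processor = SpeechT5Processor.from_pretrained("microsoft/speecht5_tts")      # base processor
model = SpeechT5ForTextToSpeech.from_pretrained("JBZhang2342/speecht5_tts")  # assumed repo id
vocoder = SpeechT5HifiGan.from_pretrained("microsoft/speecht5_hifigan")

inputs = processor(text="Hello, this is a test.", return_tensors="pt")

# SpeechT5 conditions on a 512-dim x-vector speaker embedding; zeros are a
# placeholder only and will not produce a natural voice.
speaker_embeddings = torch.zeros((1, 512))

speech = model.generate_speech(inputs["input_ids"], speaker_embeddings, vocoder=vocoder)
sf.write("speech.wav", speech.numpy(), samplerate=16000)
```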

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 1e-05
- train_batch_size: 2
- eval_batch_size: 2
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- training_steps: 10000
- mixed_precision_training: Native AMP
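
For readers who want to reproduce a comparable run, the list above maps onto `transformers` `Seq2SeqTrainingArguments` roughly as sketched below. This is an illustrative reconstruction, not the author's actual script: `output_dir` is a placeholder, and the evaluation cadence (every 250 steps) is read off the results table below.

```python
# Illustrative mapping of the listed hyperparameters; not the original script.
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="speecht5_tts",       # placeholder
    learning_rate=1e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=2,
    seed=42,
    adam_beta1=0.9,                  # Adam betas=(0.9, 0.999)
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    warmup_steps=500,
    max_steps=10000,
    fp16=True,                       # "Native AMP" mixed precision
    evaluation_strategy="steps",     # eval every 250 steps, per the results table
    eval_steps=250,
)
```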

### Training results

| Training Loss | Epoch | Step  | Validation Loss |
|:-------------:|:-----:|:-----:|:---------------:|
| No log        | 1.92  | 250   | 0.6150          |
| 0.7483        | 3.85  | 500   | 0.5494          |
| 0.7483        | 5.77  | 750   | 0.5001          |
| 0.5482        | 7.69  | 1000  | 0.4861          |
| 0.5482        | 9.62  | 1250  | 0.4792          |
| 0.502         | 11.54 | 1500  | 0.4786          |
| 0.502         | 13.46 | 1750  | 0.4804          |
| 0.4794        | 15.38 | 2000  | 0.4803          |
| 0.4794        | 17.31 | 2250  | 0.4724          |
| 0.4685        | 19.23 | 2500  | 0.4801          |
| 0.4685        | 21.15 | 2750  | 0.4740          |
| 0.4553        | 23.08 | 3000  | 0.4840          |
| 0.4553        | 25.0  | 3250  | 0.4857          |
| 0.4567        | 26.92 | 3500  | 0.4792          |
| 0.4567        | 28.85 | 3750  | 0.4831          |
| 0.445         | 30.77 | 4000  | 0.4884          |
| 0.445         | 32.69 | 4250  | 0.4845          |
| 0.4412        | 34.62 | 4500  | 0.4944          |
| 0.4412        | 36.54 | 4750  | 0.4940          |
| 0.4373        | 38.46 | 5000  | 0.4863          |
| 0.4373        | 40.38 | 5250  | 0.4899          |
| 0.4353        | 42.31 | 5500  | 0.4954          |
| 0.4353        | 44.23 | 5750  | 0.5005          |
| 0.4265        | 46.15 | 6000  | 0.4994          |
| 0.4265        | 48.08 | 6250  | 0.4918          |
| 0.4285        | 50.0  | 6500  | 0.5022          |
| 0.4285        | 51.92 | 6750  | 0.4939          |
| 0.4209        | 53.85 | 7000  | 0.4989          |
| 0.4209        | 55.77 | 7250  | 0.4959          |
| 0.4206        | 57.69 | 7500  | 0.5013          |
| 0.4206        | 59.62 | 7750  | 0.5061          |
| 0.4189        | 61.54 | 8000  | 0.5092          |
| 0.4189        | 63.46 | 8250  | 0.5084          |
| 0.422         | 65.38 | 8500  | 0.5116          |
| 0.422         | 67.31 | 8750  | 0.5115          |
| 0.415         | 69.23 | 9000  | 0.5100          |
| 0.415         | 71.15 | 9250  | 0.5121          |
| 0.4179        | 73.08 | 9500  | 0.5112          |
| 0.4179        | 75.0  | 9750  | 0.5115          |
| 0.4139        | 76.92 | 10000 | 0.5139          |

### Framework versions

- Transformers 4.36.0.dev0
- Pytorch 2.1.0+cu121
- Datasets 2.15.0
- Tokenizers 0.14.1