# t5-base_readme_summarization
This model is a fine-tuned version of [t5-base](https://huggingface.co/google-t5/t5-base) on an unspecified dataset. It achieves the following results on the evaluation set (a hedged inference sketch follows the metrics):
- Loss: 1.7573
- Rouge1: 0.4859
- Rouge2: 0.3402
- Rougel: 0.4581
- Rougelsum: 0.4581
- Gen Len: 14.1882
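
As a hedged usage sketch only (assuming the checkpoint is loaded from the Hub under the id shown in this card), the model can be driven through the standard `transformers` summarization pipeline; the input text and generation lengths below are illustrative placeholders:

```python
from transformers import pipeline

# Minimal sketch: load the fine-tuned checkpoint and summarize a README-style snippet.
# The model id comes from this card; the sample text is a placeholder.
summarizer = pipeline("summarization", model="bunbohue/t5-base_readme_summarization")

readme_text = (
    "This repository contains a command-line tool for converting Markdown "
    "documents into static HTML pages, with support for templates and plugins."
)

print(summarizer(readme_text, max_length=32, min_length=5, do_sample=False)[0]["summary_text"])
```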
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a sketch of the corresponding `Seq2SeqTrainingArguments` follows the list):
- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 5
- mixed_precision_training: Native AMP
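
As a sketch only (the original training script is not part of this card), the listed values map onto `Seq2SeqTrainingArguments` roughly as follows; anything beyond the listed hyperparameters, such as the output directory and evaluation strategy, is an assumption:

```python
from transformers import Seq2SeqTrainingArguments

training_args = Seq2SeqTrainingArguments(
    output_dir="t5-base_readme_summarization",  # assumption: output path is not stated in the card
    learning_rate=2e-5,
    per_device_train_batch_size=4,
    per_device_eval_batch_size=4,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    fp16=True,                      # "Native AMP" mixed-precision training
    evaluation_strategy="epoch",    # assumption: matches the per-epoch rows in the results table
    predict_with_generate=True,     # assumption: needed to compute ROUGE from generated summaries
)
```

Adam with betas=(0.9, 0.999) and epsilon=1e-08 matches the optimizer defaults in this Transformers version, so it is not set explicitly above.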
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:------:|:---------:|:-------:|
| 2.1761        | 1.0   | 1458 | 1.8974          | 0.4769 | 0.3281 | 0.4486 | 0.4484    | 14.265  |
| 1.9982        | 2.0   | 2916 | 1.8329          | 0.4819 | 0.3349 | 0.4553 | 0.4552    | 14.0492 |
| 1.8626        | 3.0   | 4374 | 1.7946          | 0.4793 | 0.3343 | 0.4528 | 0.4529    | 14.5971 |
| 1.8013        | 4.0   | 5832 | 1.7695          | 0.4873 | 0.3418 | 0.4609 | 0.4614    | 14.1691 |
| 1.7478        | 5.0   | 7290 | 1.7573          | 0.4859 | 0.3402 | 0.4581 | 0.4581    | 14.1882 |
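
The ROUGE values above are on a 0-1 scale. A brief sketch of how such scores can be computed with the `evaluate` library; the prediction and reference strings below are placeholders, not taken from the evaluation data:

```python
import evaluate

rouge = evaluate.load("rouge")

# Placeholder prediction/reference pair; in practice these come from model outputs
# and the held-out evaluation set.
predictions = ["A command-line tool that converts Markdown to static HTML."]
references = ["Command-line tool for converting Markdown documents into static HTML pages."]

# Returns rouge1, rouge2, rougeL and rougeLsum as floats in [0, 1].
print(rouge.compute(predictions=predictions, references=references))
```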
### Framework versions
- Transformers 4.35.0
- Pytorch 2.1.0+cu118
- Datasets 2.14.6
- Tokenizers 0.14.1