product_recommendation

This model is a fine-tuned version of t5-large on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 0.4953
  • Rouge1: 73.0159
  • Rouge2: 66.6667
  • RougeL: 72.2222
  • RougeLsum: 72.2222
  • Gen Len: 4.1905
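
The ROUGE scores above measure n-gram overlap between generated and reference text (ROUGE-1 over unigrams, ROUGE-2 over bigrams). As a minimal sketch of what the Rouge1 number means, here is a plain-Python ROUGE-1 F1 computation on a made-up example pair (the strings below are illustrative only, not from the training data):

```python
from collections import Counter

def rouge1_f1(reference: str, candidate: str) -> float:
    """ROUGE-1 F1: unigram overlap between reference and candidate."""
    ref = Counter(reference.split())
    cand = Counter(candidate.split())
    # Clipped overlap: each token counts at most as often as it
    # appears in the reference.
    overlap = sum((ref & cand).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    return 2 * precision * recall / (precision + recall)

score = rouge1_f1("wireless noise cancelling headphones",
                  "noise cancelling headphones")
print(round(score * 100, 4))  # 85.7143
```

The reported metrics are scaled the same way, to a 0-100 range; the actual evaluation typically uses a stemmed, tokenized implementation such as the `rouge_score` package, so exact values can differ slightly from this sketch.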

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0001
  • train_batch_size: 4
  • eval_batch_size: 4
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 16
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 10
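
The total train batch size above is derived rather than set directly: it is the per-device batch size multiplied by the gradient accumulation steps, so gradients from 4 micro-batches of 4 are accumulated before each optimizer step. A one-line check:

```python
# Values as listed in the hyperparameters above.
train_batch_size = 4
gradient_accumulation_steps = 4

# Effective (total) batch size per optimizer step.
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 16, matching the value reported above
```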

Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | RougeL  | RougeLsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|:-------:|
| No log        | 0.96  | 6    | 0.4314          | 60.3175 | 47.6190 | 59.8413 | 60.3175   | 4.1429  |
| No log        | 1.96  | 12   | 0.4339          | 52.6984 | 38.0952 | 53.1746 | 52.3810   | 4.0952  |
| No log        | 2.96  | 18   | 0.5350          | 65.0794 | 52.3810 | 64.2857 | 64.9206   | 4.4286  |
| No log        | 3.96  | 24   | 0.3075          | 72.8571 | 61.9048 | 72.1429 | 72.1429   | 4.1905  |
| No log        | 4.96  | 30   | 0.4016          | 74.6032 | 66.6667 | 74.6032 | 75.3968   | 4.3333  |
| No log        | 5.96  | 36   | 0.4496          | 76.1905 | 71.4286 | 74.6032 | 74.6032   | 4.1905  |
| No log        | 6.96  | 42   | 0.5539          | 60.3175 | 57.1429 | 61.9048 | 60.3175   | 4.0     |
| No log        | 7.96  | 48   | 0.3816          | 80.9524 | 76.1905 | 79.3651 | 79.3651   | 4.1905  |
| No log        | 8.96  | 54   | 0.4602          | 74.6032 | 71.4286 | 74.6032 | 74.6032   | 4.1429  |
| No log        | 9.96  | 60   | 0.4953          | 73.0159 | 66.6667 | 72.2222 | 72.2222   | 4.1905  |
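
Note that the lowest validation loss occurs well before the final epoch, so the last checkpoint is not necessarily the best one. If checkpoint selection matters for your use case, a quick scan of the logged values (copied from the table above) picks out the strongest epoch:

```python
# Validation loss per epoch, transcribed from the training results table.
val_loss_by_epoch = {
    0.96: 0.4314, 1.96: 0.4339, 2.96: 0.5350, 3.96: 0.3075,
    4.96: 0.4016, 5.96: 0.4496, 6.96: 0.5539, 7.96: 0.3816,
    8.96: 0.4602, 9.96: 0.4953,
}

# Epoch with the lowest validation loss.
best_epoch = min(val_loss_by_epoch, key=val_loss_by_epoch.get)
print(best_epoch, val_loss_by_epoch[best_epoch])  # 3.96 0.3075
```

With `Seq2SeqTrainingArguments`, passing `load_best_model_at_end=True` together with `metric_for_best_model` automates this selection during training; whether that was done for this run is not stated in the card.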

Framework versions

  • Transformers 4.26.0
  • Pytorch 2.0.1+cu118
  • Datasets 2.8.0
  • Tokenizers 0.13.3