---
library_name: transformers
tags:
  - generated_from_trainer
metrics:
  - accuracy
  - bleu
model-index:
  - name: gpt2-wikitext
    results: []
---

# gpt2-wikitext

This model is a fine-tuned version of an unspecified base model on an unknown dataset. It achieves the following results on the evaluation set (a perplexity sanity check follows the list):

- Loss: 3.1672
- Accuracy: 0.4214
- Perplexity: 23.7410
- Bleu: 0.1474
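
The reported perplexity is consistent with the exponential of the evaluation loss. A minimal check in Python, assuming the loss is the mean natural-log cross-entropy as reported by the Hugging Face Trainer:

```python
import math

eval_loss = 3.1672               # reported evaluation loss (natural-log cross-entropy)
perplexity = math.exp(eval_loss)
print(f"{perplexity:.4f}")       # ~23.74, matching the reported perplexity up to rounding
```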

## Model description

More information needed

## Intended uses & limitations

More information needed
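
Pending details from the author, here is a minimal usage sketch for text generation with the Transformers API. The repo id `shivanandmn/gpt2-wikitext` is an assumption based on the model name; adjust it to wherever the model is actually hosted:

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "shivanandmn/gpt2-wikitext"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Generate a short continuation from a prompt.
inputs = tokenizer("The history of natural language processing", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```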

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):

- learning_rate: 0.0001
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: AdamW (torch) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 5
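
For reference, a minimal sketch of how these values map onto `transformers.TrainingArguments`; the output directory is a placeholder, and the AdamW betas/epsilon listed above are the Trainer defaults for `adamw_torch`:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="gpt2-wikitext",       # placeholder output path
    learning_rate=1e-4,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=64,
    seed=42,
    optim="adamw_torch",              # AdamW with default betas=(0.9, 0.999), eps=1e-8
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    num_train_epochs=5,
)
```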

### Training results

| Training Loss | Epoch  | Step | Validation Loss | Accuracy | Perplexity | Bleu   |
|:-------------:|:------:|:----:|:---------------:|:--------:|:----------:|:------:|
| 6.0797        | 0.2806 | 500  | 5.9513          | 0.1886   | 384.2374   | 0.0328 |
| 5.0682        | 0.5612 | 1000 | 4.9243          | 0.2616   | 137.5896   | 0.0640 |
| 4.3376        | 0.8418 | 1500 | 4.2176          | 0.3220   | 67.8687    | 0.0837 |
| 3.946         | 1.1223 | 2000 | 3.8835          | 0.3531   | 48.5917    | 0.1062 |
| 3.7578        | 1.4029 | 2500 | 3.7072          | 0.3683   | 40.7404    | 0.1239 |
| 3.6484        | 1.6835 | 3000 | 3.5824          | 0.3790   | 35.9600    | 0.1312 |
| 3.5434        | 1.9641 | 3500 | 3.4942          | 0.3874   | 32.9255    | 0.1354 |
| 3.4228        | 2.2447 | 4000 | 3.4280          | 0.3938   | 30.8151    | 0.1332 |
| 3.3604        | 2.5253 | 4500 | 3.3732          | 0.3989   | 29.1719    | 0.1409 |
| 3.3288        | 2.8058 | 5000 | 3.3268          | 0.4039   | 27.8495    | 0.1396 |
| 3.2073        | 3.0864 | 5500 | 3.2888          | 0.4079   | 26.8096    | 0.1410 |
| 3.2009        | 3.3670 | 6000 | 3.2614          | 0.4109   | 26.0870    | 0.1426 |
| 3.1787        | 3.6476 | 6500 | 3.2330          | 0.4141   | 25.3543    | 0.1394 |
| 3.1528        | 3.9282 | 7000 | 3.2094          | 0.4164   | 24.7635    | 0.1459 |
| 3.0849        | 4.2088 | 7500 | 3.1927          | 0.4182   | 24.3547    | 0.1420 |
| 3.0471        | 4.4893 | 8000 | 3.1799          | 0.4200   | 24.0448    | 0.1476 |
| 3.0571        | 4.7699 | 8500 | 3.1672          | 0.4214   | 23.7410    | 0.1474 |

### Framework versions

- Transformers 4.49.0
- Pytorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0