---
license: mit
base_model: facebook/bart-large-cnn
tags:
- generated_from_trainer
metrics:
- rouge
model-index:
- name: bart-large-cnn-finetuned-scope-summarization
  results: []
---
# bart-large-cnn-finetuned-scope-summarization
This model is a fine-tuned version of [facebook/bart-large-cnn](https://huggingface.co/facebook/bart-large-cnn) on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.0552
- Rouge1: 49.8374
- Rouge2: 38.0885
- Rougel: 42.6985
- Rougelsum: 42.4809
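
The card does not include a usage snippet, so here is a minimal sketch of loading the checkpoint through the `summarization` pipeline. The repository ID is a placeholder, since the exact Hub namespace is not given in this card.

```python
from transformers import pipeline

# The model ID below is a placeholder -- replace it with the actual Hub
# namespace of this fine-tuned checkpoint, or point it at a local directory.
summarizer = pipeline(
    "summarization",
    model="bart-large-cnn-finetuned-scope-summarization",
)

text = "Paste the document or passage to be summarized here."
print(summarizer(text, do_sample=False)[0]["summary_text"])
```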
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training (a configuration sketch reproducing them follows the list):
- learning_rate: 5.6e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 20
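
The original training script is not part of this card; the sketch below only shows how the hyperparameters above map onto `Seq2SeqTrainingArguments`. The `output_dir`, evaluation strategy, and `predict_with_generate` settings are assumptions.

```python
from transformers import Seq2SeqTrainingArguments

# Hedged reconstruction, not the original script: only the values listed
# above are taken from the card; everything else is an assumption.
training_args = Seq2SeqTrainingArguments(
    output_dir="bart-large-cnn-finetuned-scope-summarization",  # assumed
    learning_rate=5.6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=20,
    evaluation_strategy="epoch",   # assumed; matches the per-epoch rows below
    predict_with_generate=True,    # assumed; needed to compute ROUGE in eval
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's AdamW
    # default, so no extra flags are required for it.
)
```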
### Training results
| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2  | Rougel  | Rougelsum |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:-------:|:-------:|:---------:|
| 0.6831        | 1.0   | 43   | 0.3928          | 40.6965 | 25.3494 | 30.1716 | 29.9938   |
| 0.3578        | 2.0   | 86   | 0.3598          | 43.284  | 27.9071 | 32.9941 | 32.9077   |
| 0.3302        | 3.0   | 129  | 0.3362          | 45.2375 | 30.4709 | 34.8733 | 34.6801   |
| 0.309         | 4.0   | 172  | 0.3136          | 44.928  | 30.8601 | 34.7804 | 34.6754   |
| 0.2948        | 5.0   | 215  | 0.2919          | 44.5169 | 30.2429 | 34.5979 | 34.4672   |
| 0.2841        | 6.0   | 258  | 0.2755          | 45.7172 | 31.6555 | 34.9668 | 34.9069   |
| 0.268         | 7.0   | 301  | 0.2618          | 46.4085 | 32.782  | 35.804  | 35.6348   |
| 0.252         | 8.0   | 344  | 0.2424          | 47.8634 | 33.6728 | 36.9559 | 36.9081   |
| 0.2405        | 9.0   | 387  | 0.2286          | 46.8182 | 34.4363 | 37.7534 | 37.6356   |
| 0.2255        | 10.0  | 430  | 0.2276          | 46.8516 | 33.3166 | 37.6246 | 37.5024   |
| 0.2125        | 11.0  | 473  | 0.1946          | 47.6772 | 33.9627 | 37.8554 | 37.7735   |
| 0.1918        | 12.0  | 516  | 0.1682          | 46.851  | 33.6098 | 38.2906 | 38.24     |
| 0.1726        | 13.0  | 559  | 0.1442          | 48.8833 | 36.4235 | 39.4263 | 39.1955   |
| 0.152         | 14.0  | 602  | 0.1305          | 50.5835 | 39.2008 | 43.3793 | 43.1671   |
| 0.1344        | 15.0  | 645  | 0.1109          | 47.3517 | 35.4446 | 38.0845 | 38.0578   |
| 0.116         | 16.0  | 688  | 0.0842          | 48.9774 | 37.6705 | 41.6306 | 41.4792   |
| 0.1007        | 17.0  | 731  | 0.0762          | 49.9775 | 38.4186 | 42.647  | 42.4334   |
| 0.0899        | 18.0  | 774  | 0.0623          | 50.1358 | 38.9943 | 43.4025 | 43.1603   |
| 0.0805        | 19.0  | 817  | 0.0571          | 51.5974 | 40.1928 | 44.1821 | 43.9354   |
| 0.0753        | 20.0  | 860  | 0.0552          | 49.8374 | 38.0885 | 42.6985 | 42.4809   |
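
The ROUGE columns above (and the headline metrics at the top of the card) are fractions scaled by 100. A minimal sketch of how such scores are typically computed with the `evaluate` library (not listed in the framework versions below, so its use here is an assumption), given lists of decoded predictions and reference summaries:

```python
import evaluate

rouge = evaluate.load("rouge")

predictions = ["decoded model summary ..."]  # placeholder model outputs
references = ["reference summary ..."]       # placeholder gold summaries

scores = rouge.compute(
    predictions=predictions,
    references=references,
    use_stemmer=True,  # assumed; a common choice for summarization eval
)
# evaluate returns values in [0, 1]; the card reports them scaled by 100.
print({k: round(v * 100, 4) for k, v in scores.items()})
```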
### Framework versions
- Transformers 4.38.2
- Pytorch 2.2.1+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2