---
license: apache-2.0
base_model: t5-large
tags:
- generated_from_trainer
datasets:
- glue
metrics:
- accuracy
model-index:
- name: t5-large_cola_dense_epochs-7_decoder_all_sparsity10
  results:
  - task:
      name: Text Classification
      type: text-classification
    dataset:
      name: glue
      type: glue
      config: cola
      split: validation
      args: cola
    metrics:
    - name: Accuracy
      type: accuracy
      value: 0.837967401725791
---

# t5-large_cola_dense_epochs-7_decoder_all_sparsity10

This model is a fine-tuned version of [t5-large](https://huggingface.co/t5-large) on the GLUE CoLA dataset. It achieves the following results on the evaluation set:
- Loss: 4.6969
- Accuracy: 0.8380

A minimal inference sketch is given under "How to use" at the end of this card.

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a `TrainingArguments` sketch mirroring them is given at the end of this card):
- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 128
- seed: 1
- distributed_type: multi-GPU
- gradient_accumulation_steps: 2
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 20
- num_epochs: 7

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| 0.5441        | 0.37  | 25   | 0.5813          | 0.6913   |
| 0.3969        | 0.75  | 50   | 0.5219          | 0.8044   |
| 0.3537        | 1.12  | 75   | 0.4713          | 0.8313   |
| 0.2905        | 1.49  | 100  | 0.6308          | 0.8150   |
| 0.3157        | 1.87  | 125  | 0.4301          | 0.8341   |
| 0.2208        | 2.24  | 150  | 2.3147          | 0.8332   |
| 0.2231        | 2.61  | 175  | 0.4612          | 0.8341   |
| 0.2404        | 2.99  | 200  | 1.5471          | 0.8265   |
| 0.1697        | 3.36  | 225  | 0.8701          | 0.8313   |
| 0.131         | 3.73  | 250  | 1.2642          | 0.8380   |
| 0.1219        | 4.1   | 275  | 0.9926          | 0.8370   |
| 0.2647        | 4.48  | 300  | 5.1919          | 0.8341   |
| 0.1329        | 4.85  | 325  | 2.2726          | 0.8418   |
| 0.0857        | 5.22  | 350  | 4.2193          | 0.8370   |
| 0.0989        | 5.6   | 375  | 5.3604          | 0.8389   |
| 0.2557        | 5.97  | 400  | 3.0246          | 0.8341   |
| 0.2617        | 6.34  | 425  | 5.6630          | 0.8456   |
| 0.2526        | 6.72  | 450  | 6.0474          | 0.8360   |

### Framework versions

- Transformers 4.34.1
- Pytorch 2.0.1+cu117
- Datasets 2.9.0
- Tokenizers 0.14.1
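
## How to use

A minimal inference sketch with the `transformers` library. It assumes the checkpoint follows T5's text-to-text format for GLUE CoLA (a `cola sentence:` task prefix and the label words "acceptable"/"unacceptable", as in the original T5 recipe); the exact prompt and label mapping used during this fine-tuning run are not documented in the card, so treat them as assumptions.

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Assumption: replace with the local path or Hub id of this checkpoint.
model_id = "t5-large_cola_dense_epochs-7_decoder_all_sparsity10"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Assumption: the T5 GLUE-style task prefix "cola sentence:".
inputs = tokenizer("cola sentence: The book was written by John.", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=5)

# Under the assumed setup, the decoded text is a label word,
# e.g. "acceptable" or "unacceptable".
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```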
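
## Reproducing the training configuration

The hyperparameters listed above map onto `transformers.TrainingArguments` roughly as sketched below. This is not the original training script (which is not included in the card); the `output_dir` value is a placeholder, and the multi-GPU launch setup is left out.

```python
from transformers import TrainingArguments

# A sketch of the hyperparameters listed in "Training hyperparameters".
training_args = TrainingArguments(
    output_dir="out",                # placeholder
    learning_rate=5e-5,
    per_device_train_batch_size=64,
    per_device_eval_batch_size=128,
    gradient_accumulation_steps=2,   # total train batch size 128
    num_train_epochs=7,
    lr_scheduler_type="linear",
    warmup_steps=20,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    seed=1,
)
```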