# layoutlm-FUNSDxSynthetic-5fold

This model is a fine-tuned version of microsoft/layoutlm-base-uncased. The training dataset is not named in this card; the model name suggests FUNSD combined with synthetic data, trained with 5-fold cross-validation. It achieves the following results on the evaluation set:
- Loss: 0.0137
- Header: precision 0.9718, recall 0.9718, F1 0.9718 (support: 71)
- Answer: precision 0.9922, recall 0.9883, F1 0.9902 (support: 256)
- Question: precision 0.9818, recall 0.9890, F1 0.9854 (support: 273)
- Overall Precision: 0.9850
- Overall Recall: 0.9867
- Overall F1: 0.9858
- Overall Accuracy: 0.9965
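The overall precision, recall, and F1 are micro-averages of the per-class results. As a sanity check, they can be reconstructed from the per-class precision, recall, and support (true positives per class = recall × support; predicted entities per class = TP / precision). A minimal sketch, using the full-precision values reported by the evaluation:

```python
# Per-class evaluation results from the card (full precision).
classes = {
    "Header":   {"precision": 0.971830985915493,  "recall": 0.971830985915493, "number": 71},
    "Answer":   {"precision": 0.9921568627450981, "recall": 0.98828125,        "number": 256},
    "Question": {"precision": 0.9818181818181818, "recall": 0.989010989010989, "number": 273},
}

# True positives and predicted-entity counts recovered per class.
tp = sum(round(c["recall"] * c["number"]) for c in classes.values())
pred = sum(round(round(c["recall"] * c["number"]) / c["precision"]) for c in classes.values())
support = sum(c["number"] for c in classes.values())

precision = tp / pred        # 592 / 601
recall = tp / support        # 592 / 600
f1 = 2 * precision * recall / (precision + recall)
print(f"P={precision:.4f} R={recall:.4f} F1={f1:.4f}")  # P=0.9850 R=0.9867 F1=0.9858
```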
## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
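These hyperparameters are consistent with the step counts in the results table below: 11 optimizer steps per epoch at train_batch_size 16 bounds the training-set size. A quick sketch, assuming no gradient accumulation (the 11 steps/epoch figure is taken from the results table):

```python
# Hyperparameters from the card, plus the steps/epoch observed in the results table.
train_batch_size = 16
steps_per_epoch = 11
num_epochs = 15

# With no gradient accumulation, steps/epoch = ceil(N / batch_size),
# so the number of training examples N is bounded by:
low = (steps_per_epoch - 1) * train_batch_size + 1   # 161
high = steps_per_epoch * train_batch_size            # 176
total_steps = steps_per_epoch * num_epochs           # 165, matching the final Step value

print(low, high, total_steps)  # 161 176 165
```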
### Training results

Per-class cells show precision / recall / F1; class supports are constant across epochs (Header: 71, Answer: 256, Question: 273).

| Training Loss | Epoch | Step | Validation Loss | Header | Answer | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|---|---|---|---|---|---|---|---|---|---|---|
| 0.0504 | 1.0 | 11 | 0.0183 | 0.9722 / 0.9859 / 0.9790 | 0.9844 / 0.9844 / 0.9844 | 0.9783 / 0.9890 / 0.9836 | 0.9801 | 0.9867 | 0.9834 | 0.9956 |
| 0.0459 | 2.0 | 22 | 0.0175 | 0.9718 / 0.9718 / 0.9718 | 0.9883 / 0.9883 / 0.9883 | 0.9747 / 0.9890 / 0.9818 | 0.9801 | 0.9867 | 0.9834 | 0.9953 |
| 0.0289 | 3.0 | 33 | 0.0146 | 0.9859 / 0.9859 / 0.9859 | 0.9922 / 0.9883 / 0.9902 | 0.9891 / 0.9963 / 0.9927 | 0.9900 | 0.9917 | 0.9908 | 0.9965 |
| 0.0243 | 4.0 | 44 | 0.0157 | 0.9718 / 0.9718 / 0.9718 | 0.9922 / 0.9883 / 0.9902 | 0.9783 / 0.9890 / 0.9836 | 0.9834 | 0.9867 | 0.9850 | 0.9958 |
| 0.0215 | 5.0 | 55 | 0.0129 | 0.9718 / 0.9718 / 0.9718 | 0.9922 / 0.9922 / 0.9922 | 0.9818 / 0.9890 / 0.9854 | 0.9850 | 0.9883 | 0.9867 | 0.9975 |
| 0.0176 | 6.0 | 66 | 0.0151 | 0.9718 / 0.9718 / 0.9718 | 0.9922 / 0.9922 / 0.9922 | 0.9783 / 0.9890 / 0.9836 | 0.9834 | 0.9883 | 0.9859 | 0.9963 |
| 0.0151 | 7.0 | 77 | 0.0149 | 0.9583 / 0.9718 / 0.9650 | 0.9922 / 0.9883 / 0.9902 | 0.9818 / 0.9853 / 0.9835 | 0.9834 | 0.9850 | 0.9842 | 0.9968 |
| 0.0136 | 8.0 | 88 | 0.0142 | 0.9583 / 0.9718 / 0.9650 | 0.9883 / 0.9883 / 0.9883 | 0.9782 / 0.9853 / 0.9818 | 0.9801 | 0.9850 | 0.9825 | 0.9965 |
| 0.0136 | 9.0 | 99 | 0.0148 | 0.9718 / 0.9718 / 0.9718 | 0.9922 / 0.9883 / 0.9902 | 0.9818 / 0.9890 / 0.9854 | 0.9850 | 0.9867 | 0.9858 | 0.9963 |
| 0.0103 | 10.0 | 110 | 0.0138 | 0.9859 / 0.9859 / 0.9859 | 0.9922 / 0.9883 / 0.9902 | 0.9855 / 0.9927 / 0.9891 | 0.9884 | 0.9900 | 0.9892 | 0.9965 |
| 0.0091 | 11.0 | 121 | 0.0136 | 0.9718 / 0.9718 / 0.9718 | 0.9922 / 0.9883 / 0.9902 | 0.9854 / 0.9890 / 0.9872 | 0.9867 | 0.9867 | 0.9867 | 0.9968 |
| 0.0081 | 12.0 | 132 | 0.0131 | 0.9859 / 0.9859 / 0.9859 | 0.9883 / 0.9883 / 0.9883 | 0.9891 / 0.9927 / 0.9909 | 0.9884 | 0.9900 | 0.9892 | 0.9973 |
| 0.0098 | 13.0 | 143 | 0.0136 | 0.9718 / 0.9718 / 0.9718 | 0.9922 / 0.9883 / 0.9902 | 0.9854 / 0.9890 / 0.9872 | 0.9867 | 0.9867 | 0.9867 | 0.9968 |
| 0.0066 | 14.0 | 154 | 0.0139 | 0.9718 / 0.9718 / 0.9718 | 0.9922 / 0.9883 / 0.9902 | 0.9818 / 0.9890 / 0.9854 | 0.9850 | 0.9867 | 0.9858 | 0.9965 |
| 0.0070 | 15.0 | 165 | 0.0137 | 0.9718 / 0.9718 / 0.9718 | 0.9922 / 0.9883 / 0.9902 | 0.9818 / 0.9890 / 0.9854 | 0.9850 | 0.9867 | 0.9858 | 0.9965 |
### Framework versions

- Transformers 4.49.0
- PyTorch 2.6.0+cu124
- Datasets 3.3.2
- Tokenizers 0.21.0
## Model tree

- Model: pabloma09/layoutlm-FUNSDxSynthetic-5fold
- Base model: microsoft/layoutlm-base-uncased