---
library_name: transformers
license: cc-by-nc-sa-4.0
base_model: microsoft/layoutlmv3-base
tags:
- generated_from_trainer
metrics:
- accuracy
- f1
model-index:
- name: layoutlmv3_document_classification
  results: []
---
# layoutlmv3_document_classification
This model is a fine-tuned version of [microsoft/layoutlmv3-base](https://huggingface.co/microsoft/layoutlmv3-base) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.0329
- Accuracy: 0.8070
- F1: 0.7806
## Model description
More information needed
## Intended uses & limitations
More information needed
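While the intended uses are not documented, the base model and metrics imply single-label classification of document page images. Below is a minimal inference sketch, assuming the fine-tuned weights are available under the hypothetical path `layoutlmv3_document_classification`; the processor is loaded from the base checkpoint named in this card, and by default it runs Tesseract OCR on the image (requires `pytesseract` and the `tesseract-ocr` binary).

```python
import torch
from PIL import Image
from transformers import AutoProcessor, AutoModelForSequenceClassification

# Processor from the base checkpoint; apply_ocr=True is the default, so the
# processor extracts words and bounding boxes from the image via Tesseract.
processor = AutoProcessor.from_pretrained("microsoft/layoutlmv3-base")

# Hypothetical local path or hub ID for this fine-tuned checkpoint.
model = AutoModelForSequenceClassification.from_pretrained(
    "layoutlmv3_document_classification"
)

image = Image.open("page.png").convert("RGB")
inputs = processor(image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring logit back to its label name.
pred = logits.argmax(-1).item()
print(model.config.id2label[pred])
```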
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a `TrainingArguments` sketch follows the list):
- learning_rate: 5e-06
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- optimizer: AdamW (`adamw_torch`) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 500
- num_epochs: 10
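The hyperparameters above map directly onto `TrainingArguments`. A minimal sketch, assuming the output directory name matches this card and inferring `eval_steps=100` from the step spacing in the results table below:

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlmv3_document_classification",  # assumed name
    learning_rate=5e-6,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    optim="adamw_torch",          # AdamW; betas=(0.9, 0.999), eps=1e-8 are the defaults
    lr_scheduler_type="linear",
    warmup_steps=500,
    num_train_epochs=10,
    eval_strategy="steps",        # evaluation every 100 steps matches the table below
    eval_steps=100,
)
```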
### Training results
| Training Loss | Epoch  | Step | Validation Loss | Accuracy | F1     |
|:-------------:|:------:|:----:|:---------------:|:--------:|:------:|
| 0.9913        | 0.5587 | 100  | 1.2412          | 0.7790   | 0.7459 |
| 0.9127        | 1.1173 | 200  | 1.2323          | 0.7804   | 0.7466 |
| 0.9307        | 1.6760 | 300  | 1.2167          | 0.7874   | 0.7550 |
| 0.8986        | 2.2346 | 400  | 1.2004          | 0.7888   | 0.7585 |
| 0.8744        | 2.7933 | 500  | 1.1751          | 0.7860   | 0.7572 |
| 0.7765        | 3.3520 | 600  | 1.1382          | 0.7874   | 0.7584 |
| 0.7695        | 3.9106 | 700  | 1.1275          | 0.7930   | 0.7642 |
| 0.7158        | 4.4693 | 800  | 1.0994          | 0.8056   | 0.7788 |
| 0.7015        | 5.0279 | 900  | 1.0844          | 0.7902   | 0.7619 |
| 0.6286        | 5.5866 | 1000 | 1.0828          | 0.8014   | 0.7728 |
| 0.693         | 6.1453 | 1100 | 1.0715          | 0.8042   | 0.7768 |
| 0.6291        | 6.7039 | 1200 | 1.0549          | 0.8070   | 0.7822 |
| 0.6422        | 7.2626 | 1300 | 1.0502          | 0.8168   | 0.7924 |
| 0.609         | 7.8212 | 1400 | 1.0368          | 0.8084   | 0.7828 |
| 0.6022        | 8.3799 | 1500 | 1.0407          | 0.8098   | 0.7824 |
| 0.5783        | 8.9385 | 1600 | 1.0343          | 0.8112   | 0.7843 |
| 0.5635        | 9.4972 | 1700 | 1.0329          | 0.8070   | 0.7806 |
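The Accuracy and F1 columns above are consistent with a standard `compute_metrics` callback passed to the `Trainer`. A minimal sketch using the `evaluate` library, assuming weighted-average F1 (the averaging mode is not documented in this card):

```python
import numpy as np
import evaluate

accuracy = evaluate.load("accuracy")
f1 = evaluate.load("f1")

def compute_metrics(eval_pred):
    # eval_pred is a (logits, labels) pair produced by the Trainer.
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=-1)
    return {
        "accuracy": accuracy.compute(predictions=predictions, references=labels)["accuracy"],
        "f1": f1.compute(predictions=predictions, references=labels, average="weighted")["f1"],
    }
```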
### Framework versions
- Transformers 4.47.1
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0