pabloma09 committed on
Commit 1b56ddb · verified · 1 Parent(s): 74d0d43

End of training

README.md CHANGED
@@ -16,14 +16,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the None dataset.
 It achieves the following results on the evaluation set:
- - Loss: 0.6076
- - Header: {'precision': 0.43333333333333335, 'recall': 0.3132530120481928, 'f1': 0.36363636363636365, 'number': 83}
- - Answer: {'precision': 0.4857142857142857, 'recall': 0.5804878048780487, 'f1': 0.5288888888888889, 'number': 205}
- - Question: {'precision': 0.358695652173913, 'recall': 0.42857142857142855, 'f1': 0.3905325443786982, 'number': 231}
- - Overall Precision: 0.4200
- - Overall Recall: 0.4701
- - Overall F1: 0.4436
- - Overall Accuracy: 0.7970
+ - Loss: 0.6138
+ - Header: {'precision': 0.4098360655737705, 'recall': 0.30120481927710846, 'f1': 0.34722222222222227, 'number': 83}
+ - Answer: {'precision': 0.45525291828793774, 'recall': 0.5707317073170731, 'f1': 0.5064935064935064, 'number': 205}
+ - Question: {'precision': 0.3793103448275862, 'recall': 0.42857142857142855, 'f1': 0.4024390243902439, 'number': 231}
+ - Overall Precision: 0.4162
+ - Overall Recall: 0.4644
+ - Overall F1: 0.4390
+ - Overall Accuracy: 0.7750
 
 ## Model description
 
@@ -53,17 +53,17 @@ The following hyperparameters were used during training:
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Header | Answer | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 1.3184 | 1.0 | 12 | 1.0718 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 83} | {'precision': 0.0707482993197279, 'recall': 0.25365853658536586, 'f1': 0.11063829787234042, 'number': 205} | {'precision': 0.09251700680272108, 'recall': 0.2943722943722944, 'f1': 0.14078674948240166, 'number': 231} | 0.0816 | 0.2312 | 0.1207 | 0.6133 |
- | 0.9674 | 2.0 | 24 | 0.7899 | {'precision': 0.16, 'recall': 0.04819277108433735, 'f1': 0.07407407407407407, 'number': 83} | {'precision': 0.23114355231143552, 'recall': 0.4634146341463415, 'f1': 0.30844155844155846, 'number': 205} | {'precision': 0.22518159806295399, 'recall': 0.4025974025974026, 'f1': 0.2888198757763975, 'number': 231} | 0.2261 | 0.3699 | 0.2807 | 0.7268 |
- | 0.7106 | 3.0 | 36 | 0.6643 | {'precision': 0.3181818181818182, 'recall': 0.1686746987951807, 'f1': 0.2204724409448819, 'number': 83} | {'precision': 0.40892193308550184, 'recall': 0.5365853658536586, 'f1': 0.4641350210970464, 'number': 205} | {'precision': 0.3263157894736842, 'recall': 0.4025974025974026, 'f1': 0.36046511627906974, 'number': 231} | 0.3629 | 0.4181 | 0.3885 | 0.7812 |
- | 0.5308 | 4.0 | 48 | 0.6111 | {'precision': 0.3125, 'recall': 0.24096385542168675, 'f1': 0.27210884353741494, 'number': 83} | {'precision': 0.4338235294117647, 'recall': 0.5756097560975609, 'f1': 0.4947589098532495, 'number': 205} | {'precision': 0.33448275862068966, 'recall': 0.4199134199134199, 'f1': 0.3723608445297505, 'number': 231} | 0.3754 | 0.4528 | 0.4105 | 0.7867 |
- | 0.4626 | 5.0 | 60 | 0.5787 | {'precision': 0.4230769230769231, 'recall': 0.26506024096385544, 'f1': 0.32592592592592595, 'number': 83} | {'precision': 0.4580152671755725, 'recall': 0.5853658536585366, 'f1': 0.5139186295503212, 'number': 205} | {'precision': 0.34657039711191334, 'recall': 0.4155844155844156, 'f1': 0.3779527559055118, 'number': 231} | 0.4027 | 0.4586 | 0.4288 | 0.7938 |
- | 0.3703 | 6.0 | 72 | 0.5845 | {'precision': 0.4339622641509434, 'recall': 0.27710843373493976, 'f1': 0.3382352941176471, 'number': 83} | {'precision': 0.46360153256704983, 'recall': 0.5902439024390244, 'f1': 0.5193133047210301, 'number': 205} | {'precision': 0.3506944444444444, 'recall': 0.43722943722943725, 'f1': 0.3892100192678227, 'number': 231} | 0.4070 | 0.4721 | 0.4371 | 0.8011 |
- | 0.33 | 7.0 | 84 | 0.6011 | {'precision': 0.45614035087719296, 'recall': 0.3132530120481928, 'f1': 0.37142857142857144, 'number': 83} | {'precision': 0.4878048780487805, 'recall': 0.5853658536585366, 'f1': 0.532150776053215, 'number': 205} | {'precision': 0.37362637362637363, 'recall': 0.44155844155844154, 'f1': 0.4047619047619048, 'number': 231} | 0.4306 | 0.4778 | 0.4530 | 0.7934 |
- | 0.2903 | 8.0 | 96 | 0.6063 | {'precision': 0.4727272727272727, 'recall': 0.3132530120481928, 'f1': 0.3768115942028985, 'number': 83} | {'precision': 0.5, 'recall': 0.5853658536585366, 'f1': 0.5393258426966292, 'number': 205} | {'precision': 0.36900369003690037, 'recall': 0.4329004329004329, 'f1': 0.398406374501992, 'number': 231} | 0.4346 | 0.4740 | 0.4535 | 0.7972 |
- | 0.2723 | 9.0 | 108 | 0.6076 | {'precision': 0.43333333333333335, 'recall': 0.3132530120481928, 'f1': 0.36363636363636365, 'number': 83} | {'precision': 0.4857142857142857, 'recall': 0.5804878048780487, 'f1': 0.5288888888888889, 'number': 205} | {'precision': 0.358695652173913, 'recall': 0.42857142857142855, 'f1': 0.3905325443786982, 'number': 231} | 0.4200 | 0.4701 | 0.4436 | 0.7970 |
+ | Training Loss | Epoch | Step | Validation Loss | Header | Answer | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:-----:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 1.198 | 1.0 | 12 | 1.0274 | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 83} | {'precision': 0.11785714285714285, 'recall': 0.32195121951219513, 'f1': 0.17254901960784313, 'number': 205} | {'precision': 0.11732851985559567, 'recall': 0.2813852813852814, 'f1': 0.16560509554140126, 'number': 231} | 0.1176 | 0.2524 | 0.1604 | 0.6381 |
+ | 0.9302 | 2.0 | 24 | 0.7826 | {'precision': 0.16666666666666666, 'recall': 0.012048192771084338, 'f1': 0.02247191011235955, 'number': 83} | {'precision': 0.21844660194174756, 'recall': 0.43902439024390244, 'f1': 0.2917341977309562, 'number': 205} | {'precision': 0.2109375, 'recall': 0.35064935064935066, 'f1': 0.2634146341463415, 'number': 231} | 0.2145 | 0.3314 | 0.2604 | 0.7183 |
+ | 0.7111 | 3.0 | 36 | 0.6407 | {'precision': 0.1794871794871795, 'recall': 0.08433734939759036, 'f1': 0.11475409836065575, 'number': 83} | {'precision': 0.3432343234323432, 'recall': 0.5073170731707317, 'f1': 0.40944881889763785, 'number': 205} | {'precision': 0.303886925795053, 'recall': 0.3722943722943723, 'f1': 0.3346303501945525, 'number': 231} | 0.3152 | 0.3796 | 0.3444 | 0.7782 |
+ | 0.5314 | 4.0 | 48 | 0.6422 | {'precision': 0.21666666666666667, 'recall': 0.1566265060240964, 'f1': 0.18181818181818182, 'number': 83} | {'precision': 0.3985239852398524, 'recall': 0.526829268292683, 'f1': 0.45378151260504207, 'number': 205} | {'precision': 0.373015873015873, 'recall': 0.4069264069264069, 'f1': 0.38923395445134573, 'number': 231} | 0.3688 | 0.4143 | 0.3902 | 0.7626 |
+ | 0.4782 | 5.0 | 60 | 0.5865 | {'precision': 0.3114754098360656, 'recall': 0.2289156626506024, 'f1': 0.2638888888888889, 'number': 83} | {'precision': 0.4036363636363636, 'recall': 0.5414634146341464, 'f1': 0.4625000000000001, 'number': 205} | {'precision': 0.336996336996337, 'recall': 0.39826839826839827, 'f1': 0.3650793650793651, 'number': 231} | 0.3645 | 0.4277 | 0.3936 | 0.7784 |
+ | 0.3789 | 6.0 | 72 | 0.6069 | {'precision': 0.3220338983050847, 'recall': 0.2289156626506024, 'f1': 0.2676056338028169, 'number': 83} | {'precision': 0.4367816091954023, 'recall': 0.5560975609756098, 'f1': 0.4892703862660945, 'number': 205} | {'precision': 0.37401574803149606, 'recall': 0.41125541125541126, 'f1': 0.3917525773195876, 'number': 231} | 0.3972 | 0.4393 | 0.4172 | 0.7696 |
+ | 0.3423 | 7.0 | 84 | 0.6048 | {'precision': 0.375, 'recall': 0.25301204819277107, 'f1': 0.3021582733812949, 'number': 83} | {'precision': 0.42911877394636017, 'recall': 0.5463414634146342, 'f1': 0.48068669527896996, 'number': 205} | {'precision': 0.39215686274509803, 'recall': 0.4329004329004329, 'f1': 0.411522633744856, 'number': 231} | 0.4073 | 0.4489 | 0.4271 | 0.7782 |
+ | 0.2995 | 8.0 | 96 | 0.6146 | {'precision': 0.3709677419354839, 'recall': 0.27710843373493976, 'f1': 0.31724137931034485, 'number': 83} | {'precision': 0.43346007604562736, 'recall': 0.5560975609756098, 'f1': 0.48717948717948717, 'number': 205} | {'precision': 0.3787878787878788, 'recall': 0.4329004329004329, 'f1': 0.40404040404040403, 'number': 231} | 0.4024 | 0.4566 | 0.4278 | 0.7758 |
+ | 0.2774 | 9.0 | 108 | 0.6138 | {'precision': 0.4098360655737705, 'recall': 0.30120481927710846, 'f1': 0.34722222222222227, 'number': 83} | {'precision': 0.45525291828793774, 'recall': 0.5707317073170731, 'f1': 0.5064935064935064, 'number': 205} | {'precision': 0.3793103448275862, 'recall': 0.42857142857142855, 'f1': 0.4024390243902439, 'number': 231} | 0.4162 | 0.4644 | 0.4390 | 0.7750 |
 
 
 ### Framework versions
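The per-entity cells in the diff above (Header, Answer, Question) follow the dictionary format emitted by the `seqeval` metric, which reports precision, recall, F1, and support (`number`) per entity type, plus overall scores. The snippet below is a minimal sketch of how such entity-level numbers are typically produced; the FUNSD-style IOB2 label set (`B-HEADER`, `I-HEADER`, ...) and the example sequences are assumptions for illustration, since the card does not state the exact evaluation code.

```python
# Minimal sketch of entity-level evaluation with the Hugging Face "seqeval" metric.
# The label scheme and the toy sequences below are assumed, not taken from this card.
import evaluate

seqeval = evaluate.load("seqeval")

# One reference sequence and one prediction, as lists of IOB2 tags per token.
references = [["B-HEADER", "I-HEADER", "O", "B-QUESTION", "B-ANSWER", "I-ANSWER"]]
predictions = [["B-HEADER", "O", "O", "B-QUESTION", "B-ANSWER", "I-ANSWER"]]

results = seqeval.compute(predictions=predictions, references=references)

# Per-entity dictionaries have the same shape as the table cells above:
# results["HEADER"] -> {'precision': ..., 'recall': ..., 'f1': ..., 'number': 1}
print(results["HEADER"], results["ANSWER"], results["QUESTION"])
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```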
logs/events.out.tfevents.1741606152.DESKTOP-HA84SVN.94162.9 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:50fb6d210f86edd4e71ae57e224e614213950796538ce6859b439073b6238c38
- size 11047
+ oid sha256:c27c63e6226a2097bb4c4358d401840b45739a4a4945371bcabc8eda6c7016bf
+ size 11890
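The log file in this commit is tracked with Git LFS, so the diff above changes only its pointer stub: a version line, the SHA-256 of the real TensorBoard event file, and its size in bytes. A minimal sketch of reading such a pointer is shown below; the parsing helper is illustrative, only the filename comes from the diff.

```python
# Illustrative reader for a Git LFS pointer stub like the one changed above.
# A pointer is plain text: "version <spec-url>", "oid sha256:<hash>", "size <bytes>".
from pathlib import Path

def read_lfs_pointer(path: str) -> dict:
    """Parse the key/value lines of a Git LFS pointer file into a dict."""
    fields = {}
    for line in Path(path).read_text().splitlines():
        if line.strip():
            key, _, value = line.partition(" ")
            fields[key] = value
    return fields

pointer = read_lfs_pointer("logs/events.out.tfevents.1741606152.DESKTOP-HA84SVN.94162.9")
print(pointer["version"])               # https://git-lfs.github.com/spec/v1
print(pointer["oid"], pointer["size"])  # sha256:c27c63e6... 11890
```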