Ifyouknowthenyouknow committed
Commit 58ac7f2
1 Parent(s): 443d8db

End of training

README.md CHANGED
@@ -17,14 +17,14 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [SCUT-DLVCLab/lilt-roberta-en-base](https://huggingface.co/SCUT-DLVCLab/lilt-roberta-en-base) on the funsd-layoutlmv3 dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.5541
- - Answer: {'precision': 0.8696655132641292, 'recall': 0.9228886168910648, 'f1': 0.8954869358669834, 'number': 817}
- - Header: {'precision': 0.6559139784946236, 'recall': 0.5126050420168067, 'f1': 0.5754716981132076, 'number': 119}
- - Question: {'precision': 0.9031963470319635, 'recall': 0.9182915506035283, 'f1': 0.9106813996316759, 'number': 1077}
- - Overall Precision: 0.8779
+ - Loss: 1.5815
+ - Answer: {'precision': 0.8604118993135011, 'recall': 0.9204406364749081, 'f1': 0.8894145476049675, 'number': 817}
+ - Header: {'precision': 0.6330275229357798, 'recall': 0.5798319327731093, 'f1': 0.6052631578947367, 'number': 119}
+ - Question: {'precision': 0.9101851851851852, 'recall': 0.9127205199628597, 'f1': 0.9114510894761243, 'number': 1077}
+ - Overall Precision: 0.8745
 - Overall Recall: 0.8962
- - Overall F1: 0.8869
- - Overall Accuracy: 0.8228
+ - Overall F1: 0.8852
+ - Overall Accuracy: 0.8209
 
 ## Model description
 
@@ -54,20 +54,20 @@ The following hyperparameters were used during training:
 
 ### Training results
 
- | Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
- |:-------------:|:------:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
- | 0.4364 | 10.53 | 200 | 0.9844 | {'precision': 0.834106728538283, 'recall': 0.8800489596083231, 'f1': 0.8564621798689696, 'number': 817} | {'precision': 0.41935483870967744, 'recall': 0.6554621848739496, 'f1': 0.5114754098360655, 'number': 119} | {'precision': 0.8516833484986351, 'recall': 0.8690807799442897, 'f1': 0.860294117647059, 'number': 1077} | 0.8072 | 0.8609 | 0.8332 | 0.7792 |
- | 0.0425 | 21.05 | 400 | 1.1086 | {'precision': 0.8603550295857988, 'recall': 0.8898408812729498, 'f1': 0.8748495788206979, 'number': 817} | {'precision': 0.6288659793814433, 'recall': 0.5126050420168067, 'f1': 0.5648148148148148, 'number': 119} | {'precision': 0.8763250883392226, 'recall': 0.9210770659238626, 'f1': 0.8981439565414214, 'number': 1077} | 0.8582 | 0.8843 | 0.8711 | 0.8199 |
- | 0.0132 | 31.58 | 600 | 1.3613 | {'precision': 0.8785276073619632, 'recall': 0.8763769889840881, 'f1': 0.8774509803921567, 'number': 817} | {'precision': 0.5182481751824818, 'recall': 0.5966386554621849, 'f1': 0.5546875, 'number': 119} | {'precision': 0.8656195462478184, 'recall': 0.9210770659238626, 'f1': 0.8924876293297346, 'number': 1077} | 0.8480 | 0.8838 | 0.8655 | 0.8090 |
- | 0.0061 | 42.11 | 800 | 1.5515 | {'precision': 0.8825, 'recall': 0.8641370869033048, 'f1': 0.8732220160791588, 'number': 817} | {'precision': 0.5689655172413793, 'recall': 0.5546218487394958, 'f1': 0.5617021276595745, 'number': 119} | {'precision': 0.8888888888888888, 'recall': 0.8987929433611885, 'f1': 0.8938134810710988, 'number': 1077} | 0.8678 | 0.8644 | 0.8661 | 0.7964 |
- | 0.0041 | 52.63 | 1000 | 1.5132 | {'precision': 0.8808664259927798, 'recall': 0.8959608323133414, 'f1': 0.8883495145631067, 'number': 817} | {'precision': 0.6296296296296297, 'recall': 0.5714285714285714, 'f1': 0.5991189427312775, 'number': 119} | {'precision': 0.8793256433007985, 'recall': 0.9201485608170845, 'f1': 0.8992740471869328, 'number': 1077} | 0.8669 | 0.8897 | 0.8782 | 0.8021 |
- | 0.0023 | 63.16 | 1200 | 1.6099 | {'precision': 0.8483466362599772, 'recall': 0.9106487148102815, 'f1': 0.8783943329397875, 'number': 817} | {'precision': 0.6470588235294118, 'recall': 0.46218487394957986, 'f1': 0.5392156862745099, 'number': 119} | {'precision': 0.8795718108831401, 'recall': 0.9155060352831941, 'f1': 0.897179253867152, 'number': 1077} | 0.8569 | 0.8867 | 0.8716 | 0.8007 |
- | 0.0013 | 73.68 | 1400 | 1.5668 | {'precision': 0.8819277108433735, 'recall': 0.8959608323133414, 'f1': 0.8888888888888888, 'number': 817} | {'precision': 0.5775862068965517, 'recall': 0.5630252100840336, 'f1': 0.5702127659574467, 'number': 119} | {'precision': 0.8972477064220183, 'recall': 0.9080779944289693, 'f1': 0.9026303645592985, 'number': 1077} | 0.8728 | 0.8828 | 0.8777 | 0.8079 |
- | 0.0009 | 84.21 | 1600 | 1.7323 | {'precision': 0.8639534883720931, 'recall': 0.9094247246022031, 'f1': 0.8861061419200955, 'number': 817} | {'precision': 0.6039603960396039, 'recall': 0.5126050420168067, 'f1': 0.5545454545454545, 'number': 119} | {'precision': 0.8962962962962963, 'recall': 0.8987929433611885, 'f1': 0.8975428836346777, 'number': 1077} | 0.8682 | 0.8803 | 0.8742 | 0.8078 |
- | 0.0008 | 94.74 | 1800 | 1.5326 | {'precision': 0.8741258741258742, 'recall': 0.9179926560587516, 'f1': 0.8955223880597015, 'number': 817} | {'precision': 0.6226415094339622, 'recall': 0.5546218487394958, 'f1': 0.5866666666666668, 'number': 119} | {'precision': 0.9044117647058824, 'recall': 0.9136490250696379, 'f1': 0.9090069284064666, 'number': 1077} | 0.8772 | 0.8942 | 0.8856 | 0.8208 |
- | 0.0003 | 105.26 | 2000 | 1.5560 | {'precision': 0.8625429553264605, 'recall': 0.9216646266829865, 'f1': 0.8911242603550296, 'number': 817} | {'precision': 0.616822429906542, 'recall': 0.5546218487394958, 'f1': 0.5840707964601769, 'number': 119} | {'precision': 0.9008264462809917, 'recall': 0.9108635097493036, 'f1': 0.9058171745152355, 'number': 1077} | 0.8700 | 0.8942 | 0.8819 | 0.8189 |
- | 0.0002 | 115.79 | 2200 | 1.5541 | {'precision': 0.8696655132641292, 'recall': 0.9228886168910648, 'f1': 0.8954869358669834, 'number': 817} | {'precision': 0.6559139784946236, 'recall': 0.5126050420168067, 'f1': 0.5754716981132076, 'number': 119} | {'precision': 0.9031963470319635, 'recall': 0.9182915506035283, 'f1': 0.9106813996316759, 'number': 1077} | 0.8779 | 0.8962 | 0.8869 | 0.8228 |
- | 0.0002 | 126.32 | 2400 | 1.5664 | {'precision': 0.8670520231213873, 'recall': 0.9179926560587516, 'f1': 0.89179548156956, 'number': 817} | {'precision': 0.6595744680851063, 'recall': 0.5210084033613446, 'f1': 0.5821596244131456, 'number': 119} | {'precision': 0.9135687732342007, 'recall': 0.9127205199628597, 'f1': 0.913144449605202, 'number': 1077} | 0.8821 | 0.8917 | 0.8869 | 0.8262 |
+ | Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
+ |:-------------:|:------:|:----:|:---------------:|:------:|:------:|:--------:|:-----------------:|:--------------:|:----------:|:----------------:|
+ | 0.4131 | 10.53 | 200 | 0.9920 | {'precision': 0.7944444444444444, 'recall': 0.8751529987760098, 'f1': 0.8328479906814211, 'number': 817} | {'precision': 0.5267857142857143, 'recall': 0.4957983193277311, 'f1': 0.5108225108225107, 'number': 119} | {'precision': 0.8690265486725663, 'recall': 0.9117920148560817, 'f1': 0.8898957861350248, 'number': 1077} | 0.8198 | 0.8723 | 0.8452 | 0.7912 |
+ | 0.0453 | 21.05 | 400 | 1.3055 | {'precision': 0.8215077605321508, 'recall': 0.9069767441860465, 'f1': 0.8621291448516578, 'number': 817} | {'precision': 0.5961538461538461, 'recall': 0.5210084033613446, 'f1': 0.5560538116591929, 'number': 119} | {'precision': 0.8818755635707844, 'recall': 0.9080779944289693, 'f1': 0.8947849954254347, 'number': 1077} | 0.8421 | 0.8847 | 0.8629 | 0.7971 |
+ | 0.0129 | 31.58 | 600 | 1.6559 | {'precision': 0.8261826182618262, 'recall': 0.9192166462668299, 'f1': 0.8702201622247971, 'number': 817} | {'precision': 0.4957983193277311, 'recall': 0.4957983193277311, 'f1': 0.4957983193277311, 'number': 119} | {'precision': 0.9050814956855225, 'recall': 0.8765088207985144, 'f1': 0.8905660377358492, 'number': 1077} | 0.8469 | 0.8713 | 0.8590 | 0.7952 |
+ | 0.0083 | 42.11 | 800 | 1.6136 | {'precision': 0.8760529482551144, 'recall': 0.8910648714810282, 'f1': 0.883495145631068, 'number': 817} | {'precision': 0.6145833333333334, 'recall': 0.4957983193277311, 'f1': 0.5488372093023256, 'number': 119} | {'precision': 0.8963922294172063, 'recall': 0.8997214484679665, 'f1': 0.8980537534754401, 'number': 1077} | 0.8745 | 0.8723 | 0.8734 | 0.8060 |
+ | 0.0058 | 52.63 | 1000 | 1.6826 | {'precision': 0.8553386911595867, 'recall': 0.9118727050183598, 'f1': 0.8827014218009479, 'number': 817} | {'precision': 0.6355140186915887, 'recall': 0.5714285714285714, 'f1': 0.6017699115044248, 'number': 119} | {'precision': 0.8902991840435177, 'recall': 0.9117920148560817, 'f1': 0.9009174311926607, 'number': 1077} | 0.8626 | 0.8917 | 0.8769 | 0.7928 |
+ | 0.0027 | 63.16 | 1200 | 1.5511 | {'precision': 0.8640661938534279, 'recall': 0.8947368421052632, 'f1': 0.8791340950090198, 'number': 817} | {'precision': 0.576, 'recall': 0.6050420168067226, 'f1': 0.5901639344262294, 'number': 119} | {'precision': 0.8985374771480804, 'recall': 0.9127205199628597, 'f1': 0.9055734684477199, 'number': 1077} | 0.8649 | 0.8872 | 0.8759 | 0.8110 |
+ | 0.0014 | 73.68 | 1400 | 1.5130 | {'precision': 0.8801452784503632, 'recall': 0.8898408812729498, 'f1': 0.8849665246500303, 'number': 817} | {'precision': 0.6213592233009708, 'recall': 0.5378151260504201, 'f1': 0.5765765765765765, 'number': 119} | {'precision': 0.8748906386701663, 'recall': 0.9285051067780873, 'f1': 0.900900900900901, 'number': 1077} | 0.8644 | 0.8897 | 0.8769 | 0.8092 |
+ | 0.001 | 84.21 | 1600 | 1.5433 | {'precision': 0.8373893805309734, 'recall': 0.9265605875152999, 'f1': 0.8797210923881464, 'number': 817} | {'precision': 0.6033057851239669, 'recall': 0.6134453781512605, 'f1': 0.6083333333333334, 'number': 119} | {'precision': 0.9138257575757576, 'recall': 0.8960074280408542, 'f1': 0.9048288795124239, 'number': 1077} | 0.8626 | 0.8917 | 0.8769 | 0.8139 |
+ | 0.0006 | 94.74 | 1800 | 1.5585 | {'precision': 0.8500576701268743, 'recall': 0.9020807833537332, 'f1': 0.8752969121140143, 'number': 817} | {'precision': 0.6371681415929203, 'recall': 0.6050420168067226, 'f1': 0.6206896551724138, 'number': 119} | {'precision': 0.8933454876937101, 'recall': 0.9099350046425255, 'f1': 0.9015639374425023, 'number': 1077} | 0.8613 | 0.8887 | 0.8748 | 0.8197 |
+ | 0.0003 | 105.26 | 2000 | 1.5719 | {'precision': 0.8505096262740657, 'recall': 0.9192166462668299, 'f1': 0.8835294117647059, 'number': 817} | {'precision': 0.6605504587155964, 'recall': 0.6050420168067226, 'f1': 0.6315789473684209, 'number': 119} | {'precision': 0.9113805970149254, 'recall': 0.9071494893221913, 'f1': 0.9092601209865054, 'number': 1077} | 0.8721 | 0.8942 | 0.8830 | 0.8246 |
+ | 0.0004 | 115.79 | 2200 | 1.5578 | {'precision': 0.8554913294797688, 'recall': 0.9057527539779682, 'f1': 0.8799048751486326, 'number': 817} | {'precision': 0.6283185840707964, 'recall': 0.5966386554621849, 'f1': 0.6120689655172413, 'number': 119} | {'precision': 0.9059907834101383, 'recall': 0.9127205199628597, 'f1': 0.9093432007400555, 'number': 1077} | 0.8696 | 0.8912 | 0.8803 | 0.8194 |
+ | 0.0003 | 126.32 | 2400 | 1.5815 | {'precision': 0.8604118993135011, 'recall': 0.9204406364749081, 'f1': 0.8894145476049675, 'number': 817} | {'precision': 0.6330275229357798, 'recall': 0.5798319327731093, 'f1': 0.6052631578947367, 'number': 119} | {'precision': 0.9101851851851852, 'recall': 0.9127205199628597, 'f1': 0.9114510894761243, 'number': 1077} | 0.8745 | 0.8962 | 0.8852 | 0.8209 |
 
 
 ### Framework versions
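
The per-entity dicts (precision, recall, f1, number) and the overall columns above match the output format of the `seqeval` metric loaded through the `evaluate` library, which the standard Transformers token-classification examples use for FUNSD-style label sets. A minimal sketch, assuming those packages are installed; the label sequences below are toy examples, not actual FUNSD predictions:

```python
# Sketch of the metric format used in the tables above (assumption: the card
# was produced with the evaluate/seqeval setup; the inputs here are toy data).
import evaluate

seqeval = evaluate.load("seqeval")

predictions = [["B-ANSWER", "I-ANSWER", "O", "B-QUESTION", "B-HEADER"]]
references  = [["B-ANSWER", "I-ANSWER", "O", "B-QUESTION", "O"]]

results = seqeval.compute(predictions=predictions, references=references)

# Per-entity dicts mirror the Answer/Header/Question columns, e.g.
# results["ANSWER"] == {'precision': 1.0, 'recall': 1.0, 'f1': 1.0, 'number': 1}
# The aggregate keys correspond to the Overall columns:
print(results["overall_precision"], results["overall_recall"],
      results["overall_f1"], results["overall_accuracy"])
```

As a sanity check, the final overall F1 is consistent with the reported precision and recall: 2 × 0.8745 × 0.8962 / (0.8745 + 0.8962) ≈ 0.8852.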
logs/events.out.tfevents.1701950878.2e154ff83e8a.428.0 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:f2a306150c6869baa2049cc9c9004ff976d175b5bd494f46b3adb0fee5063ec8
- size 12538
+ oid sha256:ae5c554ef855c38c4733e917581cbe3fcb4463f107464658369f14174051cfde
+ size 12892
logs/events.out.tfevents.1701952298.2e154ff83e8a.428.1 ADDED
@@ -0,0 +1,3 @@
+ version https://git-lfs.github.com/spec/v1
+ oid sha256:d2a94712946b3aad740f45cec35b87f92f7f5f36a4208248e6402e8afe3d3023
+ size 592
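
For reference, a hypothetical usage sketch for a LiLT token-classification checkpoint such as the one this commit trains. The repository id is a placeholder (the actual repo id is not shown in this diff), and the words and bounding boxes are toy inputs rather than FUNSD data:

```python
# Placeholder repo id and toy inputs; a sketch only, not part of the original card.
import torch
from transformers import AutoTokenizer, LiltForTokenClassification

repo_id = "your-username/lilt-en-funsd"  # placeholder, assumed repo name
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = LiltForTokenClassification.from_pretrained(repo_id)

# FUNSD-style input: words plus one 0-1000 normalized bounding box per word.
words = ["Date:", "June", "12"]
boxes = [[74, 62, 108, 74], [120, 62, 144, 74], [148, 62, 160, 74]]

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Expand word-level boxes to token level via the tokenizer's word_ids mapping;
# special tokens get a dummy [0, 0, 0, 0] box.
word_ids = encoding.word_ids(0)
bbox = [[0, 0, 0, 0] if i is None else boxes[i] for i in word_ids]
encoding["bbox"] = torch.tensor([bbox])

with torch.no_grad():
    logits = model(**encoding).logits
predicted = [model.config.id2label[t] for t in logits.argmax(-1)[0].tolist()]
print(predicted)
```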