---
license: mit
base_model: microsoft/layoutlm-base-uncased
tags:
  - generated_from_trainer
datasets:
  - funsd
model-index:
  - name: layoutlm-funsd
    results: []
---

# layoutlm-funsd

This model is a fine-tuned version of [microsoft/layoutlm-base-uncased](https://huggingface.co/microsoft/layoutlm-base-uncased) on the FUNSD dataset. It achieves the following results on the evaluation set (a usage sketch follows the metrics):

- Loss: 0.7271
- Answer: {'precision': 0.7209821428571429, 'recall': 0.7985166872682324, 'f1': 0.7577712609970675, 'number': 809}
- Header: {'precision': 0.3308270676691729, 'recall': 0.3697478991596639, 'f1': 0.3492063492063492, 'number': 119}
- Question: {'precision': 0.7732049036777583, 'recall': 0.8291079812206573, 'f1': 0.8001812415043046, 'number': 1065}
- Overall Precision: 0.7246
- Overall Recall: 0.7893
- Overall F1: 0.7555
- Overall Accuracy: 0.8054
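Below is a minimal inference sketch, not part of the original card: the repo id `lmurray/layoutlm-funsd`, the example words, and the bounding boxes are illustrative assumptions.

```python
# Minimal inference sketch. The repo id and the toy words/boxes below are
# assumptions for illustration, not taken from the card.
import torch
from transformers import AutoTokenizer, LayoutLMForTokenClassification

model_id = "lmurray/layoutlm-funsd"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = LayoutLMForTokenClassification.from_pretrained(model_id)

# LayoutLM expects one 0-1000-normalized [x0, y0, x1, y1] box per token.
words = ["Date:", "2024-01-01"]                    # toy example
boxes = [[48, 84, 120, 100], [130, 84, 230, 100]]  # one box per word

encoding = tokenizer(words, is_split_into_words=True, return_tensors="pt")
# Expand word-level boxes to token level; special tokens get a dummy box.
word_ids = encoding.word_ids(batch_index=0)
token_boxes = [boxes[i] if i is not None else [0, 0, 0, 0] for i in word_ids]
encoding["bbox"] = torch.tensor([token_boxes])

with torch.no_grad():
    logits = model(**encoding).logits
labels = [model.config.id2label[p] for p in logits.argmax(-1).squeeze().tolist()]
print(list(zip(word_ids, labels)))
```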

## Model description

LayoutLM base (uncased) extends BERT with 2-D position embeddings so that token classification can use both the text and the layout of a scanned document. This checkpoint fine-tunes it to tag FUNSD form fields as header, question, or answer entities.

## Intended uses & limitations

The model is intended for key-value extraction from scanned forms similar to FUNSD. It was trained on a small dataset (under 200 annotated forms), so performance on other document layouts or languages is not guaranteed; header entities in particular score much lower (F1 ≈ 0.35) than questions and answers.

## Training and evaluation data

The model was fine-tuned on FUNSD (Form Understanding in Noisy Scanned Documents), a set of 199 scanned forms annotated with word-level bounding boxes and entity labels (header, question, answer, other), split into 149 training and 50 test documents.
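A loading sketch is shown below; it assumes the `nielsr/funsd` Hub mirror used in the LayoutLM examples, since the card does not record the exact dataset id used for this run.

```python
# Sketch only: assumes the "nielsr/funsd" mirror of FUNSD on the Hub.
from datasets import load_dataset

dataset = load_dataset("nielsr/funsd")
print(dataset)                     # train/test splits of annotated forms
example = dataset["train"][0]
print(example.keys())              # fields such as words, bboxes, ner_tags
```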

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a matching `TrainingArguments` sketch follows the list):

- learning_rate: 3e-05
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 15
- mixed_precision_training: Native AMP
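The hyperparameters above map directly onto `transformers.TrainingArguments`; in this reconstruction only `output_dir` is an assumption, everything else mirrors the card.

```python
# Reconstruction of the hyperparameters listed above; output_dir is an
# assumed value, all other settings come from the card.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlm-funsd",   # assumption
    learning_rate=3e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="linear",
    num_train_epochs=15,
    fp16=True,                     # "Native AMP" mixed precision
)
```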

### Training results

| Training Loss | Epoch | Step | Validation Loss | Answer | Header | Question | Overall Precision | Overall Recall | Overall F1 | Overall Accuracy |
|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|:---:|
| 1.8393 | 1.0 | 10 | 1.5994 | {'precision': 0.02424942263279446, 'recall': 0.02595797280593325, 'f1': 0.02507462686567164, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.22250639386189258, 'recall': 0.16338028169014085, 'f1': 0.18841364374661615, 'number': 1065} | 0.1183 | 0.0978 | 0.1071 | 0.3781 |
| 1.4614 | 2.0 | 20 | 1.2520 | {'precision': 0.121765601217656, 'recall': 0.09888751545117429, 'f1': 0.10914051841746247, 'number': 809} | {'precision': 0.0, 'recall': 0.0, 'f1': 0.0, 'number': 119} | {'precision': 0.47543713572023316, 'recall': 0.536150234741784, 'f1': 0.503971756398941, 'number': 1065} | 0.3504 | 0.3266 | 0.3381 | 0.5844 |
| 1.1144 | 3.0 | 30 | 0.9610 | {'precision': 0.5, 'recall': 0.5278121137206427, 'f1': 0.513529765484065, 'number': 809} | {'precision': 0.037037037037037035, 'recall': 0.008403361344537815, 'f1': 0.0136986301369863, 'number': 119} | {'precision': 0.6224758560140474, 'recall': 0.6657276995305165, 'f1': 0.6433756805807623, 'number': 1065} | 0.5629 | 0.5705 | 0.5667 | 0.7246 |
| 0.8509 | 4.0 | 40 | 0.7961 | {'precision': 0.5943396226415094, 'recall': 0.7008652657601978, 'f1': 0.6432217810550198, 'number': 809} | {'precision': 0.3, 'recall': 0.12605042016806722, 'f1': 0.17751479289940827, 'number': 119} | {'precision': 0.6532188841201717, 'recall': 0.7145539906103286, 'f1': 0.6825112107623318, 'number': 1065} | 0.6192 | 0.6739 | 0.6454 | 0.7543 |
| 0.6724 | 5.0 | 50 | 0.7346 | {'precision': 0.6362683438155137, 'recall': 0.7503090234857849, 'f1': 0.6885989790130459, 'number': 809} | {'precision': 0.3466666666666667, 'recall': 0.2184873949579832, 'f1': 0.26804123711340205, 'number': 119} | {'precision': 0.6597444089456869, 'recall': 0.7755868544600939, 'f1': 0.7129909365558912, 'number': 1065} | 0.6396 | 0.7321 | 0.6827 | 0.7797 |
| 0.5743 | 6.0 | 60 | 0.7086 | {'precision': 0.6481481481481481, 'recall': 0.7787391841779975, 'f1': 0.7074677147669849, 'number': 809} | {'precision': 0.3424657534246575, 'recall': 0.21008403361344538, 'f1': 0.2604166666666667, 'number': 119} | {'precision': 0.7128116938950989, 'recall': 0.7784037558685446, 'f1': 0.7441651705565528, 'number': 1065} | 0.6721 | 0.7446 | 0.7065 | 0.7845 |
| 0.4963 | 7.0 | 70 | 0.6881 | {'precision': 0.6866952789699571, 'recall': 0.7911001236093943, 'f1': 0.7352096496266514, 'number': 809} | {'precision': 0.30392156862745096, 'recall': 0.2605042016806723, 'f1': 0.28054298642533937, 'number': 119} | {'precision': 0.7263249348392702, 'recall': 0.7849765258215963, 'f1': 0.7545126353790614, 'number': 1065} | 0.6897 | 0.7561 | 0.7214 | 0.7926 |
| 0.4392 | 8.0 | 80 | 0.7116 | {'precision': 0.6779487179487179, 'recall': 0.8170580964153276, 'f1': 0.741031390134529, 'number': 809} | {'precision': 0.28431372549019607, 'recall': 0.24369747899159663, 'f1': 0.26244343891402716, 'number': 119} | {'precision': 0.7322175732217573, 'recall': 0.8215962441314554, 'f1': 0.7743362831858407, 'number': 1065} | 0.6888 | 0.7852 | 0.7339 | 0.7886 |
| 0.3945 | 9.0 | 90 | 0.7000 | {'precision': 0.7060737527114967, 'recall': 0.8046971569839307, 'f1': 0.7521663778162911, 'number': 809} | {'precision': 0.2920353982300885, 'recall': 0.2773109243697479, 'f1': 0.28448275862068967, 'number': 119} | {'precision': 0.7502183406113537, 'recall': 0.8065727699530516, 'f1': 0.7773755656108599, 'number': 1065} | 0.7078 | 0.7742 | 0.7395 | 0.7989 |
| 0.3825 | 10.0 | 100 | 0.7006 | {'precision': 0.717607973421927, 'recall': 0.8009888751545118, 'f1': 0.7570093457943926, 'number': 809} | {'precision': 0.29464285714285715, 'recall': 0.2773109243697479, 'f1': 0.28571428571428575, 'number': 119} | {'precision': 0.7642418930762489, 'recall': 0.8187793427230047, 'f1': 0.7905711695376247, 'number': 1065} | 0.7203 | 0.7792 | 0.7486 | 0.8053 |
| 0.327 | 11.0 | 110 | 0.7180 | {'precision': 0.6969376979936642, 'recall': 0.8158220024721878, 'f1': 0.7517084282460138, 'number': 809} | {'precision': 0.2975206611570248, 'recall': 0.3025210084033613, 'f1': 0.3, 'number': 119} | {'precision': 0.7572898799313894, 'recall': 0.8291079812206573, 'f1': 0.7915732855221873, 'number': 1065} | 0.7068 | 0.7923 | 0.7471 | 0.7969 |
| 0.3136 | 12.0 | 120 | 0.7147 | {'precision': 0.7283950617283951, 'recall': 0.8022249690976514, 'f1': 0.7635294117647059, 'number': 809} | {'precision': 0.3305084745762712, 'recall': 0.3277310924369748, 'f1': 0.32911392405063294, 'number': 119} | {'precision': 0.7831111111111111, 'recall': 0.8272300469483568, 'f1': 0.8045662100456622, 'number': 1065} | 0.7352 | 0.7873 | 0.7604 | 0.8059 |
| 0.2943 | 13.0 | 130 | 0.7297 | {'precision': 0.7136659436008677, 'recall': 0.8133498145859085, 'f1': 0.7602541883304449, 'number': 809} | {'precision': 0.34210526315789475, 'recall': 0.3277310924369748, 'f1': 0.33476394849785407, 'number': 119} | {'precision': 0.7785588752196837, 'recall': 0.831924882629108, 'f1': 0.8043576940535634, 'number': 1065} | 0.7282 | 0.7943 | 0.7598 | 0.8001 |
| 0.2727 | 14.0 | 140 | 0.7284 | {'precision': 0.7275784753363229, 'recall': 0.8022249690976514, 'f1': 0.7630805408583187, 'number': 809} | {'precision': 0.3333333333333333, 'recall': 0.3697478991596639, 'f1': 0.350597609561753, 'number': 119} | {'precision': 0.7732049036777583, 'recall': 0.8291079812206573, 'f1': 0.8001812415043046, 'number': 1065} | 0.7276 | 0.7908 | 0.7579 | 0.8046 |
| 0.2753 | 15.0 | 150 | 0.7271 | {'precision': 0.7209821428571429, 'recall': 0.7985166872682324, 'f1': 0.7577712609970675, 'number': 809} | {'precision': 0.3308270676691729, 'recall': 0.3697478991596639, 'f1': 0.3492063492063492, 'number': 119} | {'precision': 0.7732049036777583, 'recall': 0.8291079812206573, 'f1': 0.8001812415043046, 'number': 1065} | 0.7246 | 0.7893 | 0.7555 | 0.8054 |
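The per-entity cells above follow the output format of the `seqeval` metric. Below is a hedged sketch of the kind of `compute_metrics` hook that produces such values; the card does not include the training script, so the label list and wiring are assumptions based on the common LayoutLM/FUNSD recipe.

```python
# Hedged sketch of a seqeval-based compute_metrics hook; the label list is
# the standard FUNSD BIO set and is an assumption, as the card ships no code.
import numpy as np
import evaluate

seqeval = evaluate.load("seqeval")
label_list = ["O", "B-HEADER", "I-HEADER", "B-QUESTION", "I-QUESTION",
              "B-ANSWER", "I-ANSWER"]

def compute_metrics(eval_pred):
    logits, labels = eval_pred
    predictions = np.argmax(logits, axis=2)
    # Drop padding/special positions, which carry the ignore index -100.
    true_preds = [[label_list[p] for p, l in zip(pr, lb) if l != -100]
                  for pr, lb in zip(predictions, labels)]
    true_labels = [[label_list[l] for p, l in zip(pr, lb) if l != -100]
                   for pr, lb in zip(predictions, labels)]
    results = seqeval.compute(predictions=true_preds, references=true_labels)
    return {"precision": results["overall_precision"],
            "recall": results["overall_recall"],
            "f1": results["overall_f1"],
            "accuracy": results["overall_accuracy"]}
```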

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1