---
library_name: transformers
license: cc-by-nc-sa-4.0
base_model: microsoft/layoutlmv2-base-uncased
tags:
  - generated_from_trainer
model-index:
  - name: layoutlmv2_docvqa_first
    results: []
---

layoutlmv2_docvqa_first

This model is a fine-tuned version of microsoft/layoutlmv2-base-uncased on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 3.2572
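
A minimal inference sketch is shown below. It is an illustration, not part of the original card: the repository id `razmark/layoutlmv2_docvqa_first`, the image path, and the question are assumed placeholders, and LayoutLMv2 additionally requires detectron2 (visual backbone) and pytesseract (OCR) to be installed.

```python
import torch
from PIL import Image
from transformers import LayoutLMv2Processor, LayoutLMv2ForQuestionAnswering

# The base processor runs OCR (pytesseract) and tokenizes the question plus the OCR'd words.
processor = LayoutLMv2Processor.from_pretrained("microsoft/layoutlmv2-base-uncased")
model = LayoutLMv2ForQuestionAnswering.from_pretrained("razmark/layoutlmv2_docvqa_first")  # assumed repo id
model.eval()

image = Image.open("document.png").convert("RGB")  # placeholder document image
question = "What is the invoice number?"           # placeholder question

encoding = processor(image, question, return_tensors="pt", truncation=True)
with torch.no_grad():
    outputs = model(**encoding)

# Decode the highest-scoring answer span from the encoded tokens.
start = outputs.start_logits.argmax(-1).item()
end = outputs.end_logits.argmax(-1).item()
answer = processor.tokenizer.decode(encoding["input_ids"][0][start : end + 1])
print(answer)
```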

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a TrainingArguments sketch reproducing them follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 2
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 1
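
The card is generated from the Hugging Face Trainer, so these settings map onto a `TrainingArguments` object roughly as in the sketch below. Values not stated in the card (the output directory, and the 10-step evaluation/logging interval inferred from the results table) are assumptions.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="layoutlmv2_docvqa_first",  # placeholder, not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=2,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=1,
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    eval_strategy="steps",  # inferred: validation loss is reported every 10 steps
    eval_steps=10,
    logging_steps=10,
)
```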

Training results

| Training Loss | Epoch | Step | Validation Loss |
|:-------------:|:-----:|:----:|:---------------:|
| 3.8046 | 0.0221 | 10 | 4.6015 |
| 3.8232 | 0.0442 | 20 | 4.7682 |
| 3.7611 | 0.0664 | 30 | 4.6595 |
| 4.1572 | 0.0885 | 40 | 4.3268 |
| 4.1477 | 0.1106 | 50 | 4.0399 |
| 4.6312 | 0.1327 | 60 | 4.4170 |
| 4.2506 | 0.1549 | 70 | 3.8923 |
| 3.814 | 0.1770 | 80 | 4.2276 |
| 4.517 | 0.1991 | 90 | 4.2349 |
| 4.5446 | 0.2212 | 100 | 3.7886 |
| 4.482 | 0.2434 | 110 | 3.7871 |
| 4.0186 | 0.2655 | 120 | 3.8549 |
| 3.6858 | 0.2876 | 130 | 3.8050 |
| 3.826 | 0.3097 | 140 | 3.7714 |
| 3.6791 | 0.3319 | 150 | 4.1984 |
| 4.1924 | 0.3540 | 160 | 3.7813 |
| 3.8497 | 0.3761 | 170 | 3.6755 |
| 3.6868 | 0.3982 | 180 | 3.8626 |
| 4.4311 | 0.4204 | 190 | 3.6253 |
| 3.7333 | 0.4425 | 200 | 3.5505 |
| 3.971 | 0.4646 | 210 | 3.6746 |
| 3.5623 | 0.4867 | 220 | 3.4512 |
| 3.3989 | 0.5088 | 230 | 3.3751 |
| 3.5899 | 0.5310 | 240 | 3.3715 |
| 2.9724 | 0.5531 | 250 | 3.5085 |
| 4.0829 | 0.5752 | 260 | 3.4023 |
| 3.4239 | 0.5973 | 270 | 3.2687 |
| 3.3833 | 0.6195 | 280 | 3.4527 |
| 4.1309 | 0.6416 | 290 | 3.5902 |
| 3.7569 | 0.6637 | 300 | 3.3229 |
| 3.5462 | 0.6858 | 310 | 3.3342 |
| 3.4489 | 0.7080 | 320 | 3.3608 |
| 2.9949 | 0.7301 | 330 | 3.3340 |
| 3.3155 | 0.7522 | 340 | 3.3299 |
| 3.2365 | 0.7743 | 350 | 3.4070 |
| 3.3671 | 0.7965 | 360 | 3.3916 |
| 3.3138 | 0.8186 | 370 | 3.2162 |
| 3.6234 | 0.8407 | 380 | 3.1907 |
| 3.7106 | 0.8628 | 390 | 3.2870 |
| 2.9935 | 0.8850 | 400 | 3.2960 |
| 3.4659 | 0.9071 | 410 | 3.2714 |
| 2.8749 | 0.9292 | 420 | 3.2891 |
| 3.378 | 0.9513 | 430 | 3.2819 |
| 3.1755 | 0.9735 | 440 | 3.2651 |
| 3.4684 | 0.9956 | 450 | 3.2572 |

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.3.1+cu121
  • Datasets 2.20.0
  • Tokenizers 0.19.1
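
A quick sanity check of the pinned environment (the version strings are taken from the list above; the CUDA suffix on the PyTorch build is dropped in the comparison):

```python
import datasets
import tokenizers
import torch
import transformers

assert transformers.__version__ == "4.44.2"
assert torch.__version__.startswith("2.3.1")  # reported build: 2.3.1+cu121
assert datasets.__version__ == "2.20.0"
assert tokenizers.__version__ == "0.19.1"
```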