kaixkhazaki committed: Update README.md

README.md
---
datasets:
- pierreguillou/DocLayNet-base
metrics:
- accuracy
base_model:
- facebook/deit-base-distilled-patch16-224
library_name: transformers
tags:
- vision
- document-layout-analysis
- document-classification
- deit
- doclaynet
---

# Data-efficient Image Transformer (DeiT) for Document Classification (DocLayNet)

This model is a Data-efficient Image Transformer (DeiT) fine-tuned for document layout classification on the DocLayNet dataset.

It was trained on document images from the DocLayNet dataset, covering the following categories (with their class indexes):

{'financial_reports': 0,
 'government_tenders': 1,
 'laws_and_regulations': 2,
 'manuals': 3,
 'patents': 4,
 'scientific_articles': 5}
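
For interpreting the model's output indexes, the inverse index-to-label mapping can be built directly from the categories above (a small reference sketch; the names and indexes are exactly those listed in this card):

```python
# Label mapping taken from the category indexes listed above.
label2id = {
    "financial_reports": 0,
    "government_tenders": 1,
    "laws_and_regulations": 2,
    "manuals": 3,
    "patents": 4,
    "scientific_articles": 5,
}
id2label = {idx: name for name, idx in label2id.items()}

print(id2label[5])  # scientific_articles
```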

## Model description

DeiT (facebook/deit-base-distilled-patch16-224) fine-tuned on document classification.

Dataset: https://huggingface.co/datasets/pierreguillou/DocLayNet-base

hyperparameters:

{
 'batch_size': 128,
 'num_epochs': 20,
 'learning_rate': 1e-4,
 'weight_decay': 0.1,
 'warmup_ratio': 0.1,
 'gradient_clip': 0.1,
 'dropout_rate': 0.1,
 'label_smoothing': 0.1,
 'optimizer': 'AdamW'
}
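
A minimal sketch of how these hyperparameters could map onto a standard `transformers` `TrainingArguments` setup. This is an illustration, not the original training script behind this card; the `output_dir`, the assumption that `'dropout_rate'` corresponds to `hidden_dropout_prob`, and the omitted dataset preprocessing are all placeholders.

```python
# Illustrative mapping of the hyperparameters above onto TrainingArguments;
# not the author's original training code.
from transformers import DeiTForImageClassification, TrainingArguments

model = DeiTForImageClassification.from_pretrained(
    "facebook/deit-base-distilled-patch16-224",
    num_labels=6,                     # six DocLayNet document categories
    ignore_mismatched_sizes=True,     # replace the 1000-class ImageNet head
    hidden_dropout_prob=0.1,          # assumed mapping for 'dropout_rate'
)

args = TrainingArguments(
    output_dir="deit-doclaynet",      # placeholder output directory
    per_device_train_batch_size=128,  # 'batch_size'
    num_train_epochs=20,              # 'num_epochs'
    learning_rate=1e-4,
    weight_decay=0.1,
    warmup_ratio=0.1,
    max_grad_norm=0.1,                # 'gradient_clip'
    label_smoothing_factor=0.1,       # 'label_smoothing'
    optim="adamw_torch",              # 'optimizer': 'AdamW'
)

# A Trainer(model=model, args=args, train_dataset=..., eval_dataset=...) call
# would then run fine-tuning on preprocessed DocLayNet-base splits.
```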

## Evaluation results

Test Loss: 0.8134, Test Accuracy: 81.56%

## Usage
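
A minimal inference sketch using the standard `transformers` image-classification API. The Hub repository id below is a placeholder for this model's actual id, and `document.png` stands for any document page image.

```python
# Inference sketch; MODEL_ID is a placeholder for this model's Hub repo id.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

MODEL_ID = "<this-model-repo-id>"   # replace with the actual repository id
processor = AutoImageProcessor.from_pretrained(MODEL_ID)
model = AutoModelForImageClassification.from_pretrained(MODEL_ID)
model.eval()

image = Image.open("document.png").convert("RGB")  # any document page image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

pred = logits.argmax(-1).item()
print(model.config.id2label[pred])  # e.g. 'scientific_articles'
```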