---
license: apache-2.0
---

KEPTlongformer is pretrained using [contrastive learning](https://arxiv.org/pdf/2210.03304.pdf).

The model was first initialized from [Clinical-Longformer](https://huggingface.co/yikuan8/Clinical-Longformer), and then pretrained with Hierarchical Self-Alignment Pretraining (HSAP) on the UMLS knowledge graph. HSAP covers three tasks: (a) hierarchy, (b) synonym, and (c) abbreviation. For more details, see Section 3.3 of the [paper](https://arxiv.org/pdf/2210.03304.pdf).
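
As a rough illustration of the synonym-alignment piece of this contrastive pretraining, the sketch below computes an InfoNCE-style loss with in-batch negatives over a few made-up UMLS synonym pairs. The pair data, pooling choice, and temperature are illustrative assumptions, not the authors' exact HSAP training code.

```python
import torch
import torch.nn.functional as F
from transformers import AutoTokenizer, AutoModel

# Start from Clinical-Longformer, as described above.
tokenizer = AutoTokenizer.from_pretrained("yikuan8/Clinical-Longformer")
encoder = AutoModel.from_pretrained("yikuan8/Clinical-Longformer")

# Hypothetical UMLS synonym pairs (two surface forms of the same concept).
pairs = [
    ("myocardial infarction", "heart attack"),
    ("hypertension", "high blood pressure"),
    ("cerebrovascular accident", "stroke"),
]

def embed(texts):
    batch = tokenizer(texts, padding=True, return_tensors="pt")
    hidden = encoder(**batch).last_hidden_state[:, 0]  # first-token pooling (assumed)
    return F.normalize(hidden, dim=-1)

anchors = embed([a for a, _ in pairs])
positives = embed([b for _, b in pairs])

# In-batch negatives: the matching synonym sits on the diagonal.
logits = anchors @ positives.T / 0.05  # temperature 0.05 is an assumed value
loss = F.cross_entropy(logits, torch.arange(len(pairs)))
loss.backward()
```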

See [here](https://github.com/whaleloops/KEPT/tree/rerank300) for how to use this model for automatic ICD coding.
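
For quick experimentation outside that repository, the checkpoint can be loaded like any Hugging Face encoder. This is a minimal sketch: `whaleloops/keptlongformer` is a placeholder for this model's actual repo ID, and the full candidate generation and reranking pipeline for ICD coding lives in the linked repository.

```python
from transformers import AutoTokenizer, AutoModel

# Placeholder ID -- substitute this repository's actual model ID.
model_id = "whaleloops/keptlongformer"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

note = "Patient admitted with chest pain and shortness of breath."
inputs = tokenizer(note, return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```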

On automatic ICD coding, it achieves the following results:

| Metric | Score |
| ------------- | ------------- |
| rec_micro | 0.5729403619819988 |
| rec_macro | 0.11342156911120573 |
| rec_at_8 | 0.4094837705486378 |
| rec_at_75 | 0.8470734920535119 |
| rec_at_50 | 0.8005338782352 |
| rec_at_5 | 0.2891628170355805 |
| rec_at_15 | 0.5768778119750537 |
| prec_micro | 0.6411968713105065 |
| prec_macro | 0.12227610414493029 |
| prec_at_8 | 0.7760972716488731 |
| prec_at_75 | 0.197504942665085 |
| prec_at_50 | 0.2768090154211151 |
| prec_at_5 | 0.8483392645314354 |
| prec_at_15 | 0.6178529062870699 |
| f1_micro | 0.6051499904242899 |
| f1_macro | 0.11768251595637802 |
| f1_at_8 | 0.536107150495997 |
| f1_at_75 | 0.32032290907137506 |
| f1_at_50 | 0.411373195944102 |
| f1_at_5 | 0.43131028155283435 |
| f1_at_15 | 0.5966627077602488 |
| auc_micro | 0.9651754312635265 |
| auc_macro | 0.8566590059725866 |
| acc_micro | 0.43384592341105344 |
| acc_macro | 0.08639139221100567 |

A sister model is available [here](https://huggingface.co/whaleloops/KEPTlongformer-PMM3/).