Update README.md
README.md
CHANGED
@@ -9,11 +9,11 @@ metrics:
 library_name: transformers
 pipeline_tag: fill-mask
 ---
-# Model Card for
+# Model Card for omarmomen/sf_ip_babylm_1

 This model is part of the experiments in my master's thesis titled "Linguistic Structure Induction from Language Models" (https://arxiv.org/abs/2403.09714).

-"omarmomen/sf_ip_babylm_1" is the StructFormer (
+"omarmomen/sf_ip_babylm_1" is the StructFormer (SF_m=4) referred to in Chapter 5 (p. 59); it is an in-between parser variant with the parser network positioned after 4 transformer blocks.

-The model is trained on the BabyLM 10M dataset, with a RobertaTokenizer pretrained on the BabyLM 10M dataset with 16K tokens.
+The model is trained on the BabyLM 10M dataset, with a RobertaTokenizer pretrained on the BabyLM 10M dataset with 16K tokens (https://huggingface.co/omarmomen/babylm_bpe_tokenizer_32k).
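Since the updated card declares `library_name: transformers` and `pipeline_tag: fill-mask`, the model should be loadable through the standard fill-mask pipeline. The sketch below is a minimal, unverified example: the repo id and the `<mask>` token (RobertaTokenizer's default) come from the card, while `trust_remote_code=True` and the sample sentence are assumptions, since the StructFormer architecture likely relies on custom modeling code shipped with the repo.

```python
# Minimal sketch: load omarmomen/sf_ip_babylm_1 as a fill-mask pipeline.
# Assumption: the repo ships custom StructFormer modeling code, so
# trust_remote_code=True may be required to instantiate the model.
from transformers import pipeline

unmasker = pipeline(
    "fill-mask",
    model="omarmomen/sf_ip_babylm_1",
    trust_remote_code=True,
)

# RobertaTokenizer uses "<mask>" as its mask token.
print(unmasker("The children <mask> in the garden."))
```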