---
license: mit
datasets:
- omarmomen/babylm_10M
language:
- en
metrics:
- perplexity
library_name: transformers
pipeline_tag: fill-mask
---
# Model Card for omarmomen/sf_ip_babylm_1

This model is part of the experiments in my master's thesis titled "Linguistic Structure Induction from Language Models" (https://arxiv.org/abs/2403.09714).

"omarmomen/sf_ip_babylm_1" is the StructFormer (SF_m=4) referred to in Chapter 5 (p. 59); it is an in-between parser variant with the parser network positioned after 4 transformer blocks.

The model is trained on the BabyLM 10M dataset, using a RobertaTokenizer with a 16K-token vocabulary that was itself pretrained on BabyLM 10M (https://huggingface.co/omarmomen/babylm_bpe_tokenizer_16k).
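A minimal fill-mask usage sketch, under two assumptions: the checkpoint ships its custom StructFormer modeling code (hence `trust_remote_code=True`), and the tokenizer files are bundled with the model repository (otherwise load them from omarmomen/babylm_bpe_tokenizer_16k instead):

```python
from transformers import AutoModelForMaskedLM, AutoTokenizer, pipeline

model_id = "omarmomen/sf_ip_babylm_1"

# Assumption: tokenizer files are bundled with the model repo; if not,
# load from "omarmomen/babylm_bpe_tokenizer_16k" instead.
tokenizer = AutoTokenizer.from_pretrained(model_id)

# StructFormer is a custom architecture, so the repository's own modeling
# code is likely required; drop trust_remote_code if the checkpoint loads
# as a standard transformers model.
model = AutoModelForMaskedLM.from_pretrained(model_id, trust_remote_code=True)

# RobertaTokenizer uses <mask> as its mask token.
fill = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for candidate in fill("The children played in the <mask>."):
    print(candidate["token_str"], candidate["score"])
```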
