Narrativa committed
Commit 5c237a7 · 1 Parent(s): 7ba08c8

Update README.md

Files changed (1)
  1. README.md +5 -9
README.md CHANGED
@@ -7,10 +7,8 @@ widget:
  tags:
  - Long documents
  - longformer
- - bertin
+ - robertalex
  - spanish
- datasets:
- - spanish_large_corpus

  ---

@@ -18,9 +16,7 @@ datasets:

  ## [Longformer](https://arxiv.org/abs/2004.05150) is a Transformer model for long documents.

- `legal-longformer-base-4096` is a BERT-like model started from the RoBERTa checkpoint (**[RoBERTalex](PlanTL-GOB-ES/RoBERTalex)** in this case) and pre-trained for *MLM* on long documents (from BETO's `all_wikis`). It supports sequences of length up to 4,096!
-
-
+ `legal-longformer-base-4096` is a BERT-like model started from the RoBERTa checkpoint (**[RoBERTalex](PlanTL-GOB-ES/RoBERTalex)** in this case) and pre-trained for *MLM* on long documents from the [Spanish Legal Domain Corpora](https://zenodo.org/record/5495529/#.Y205lpHMKV5). It supports sequences of length up to **4,096**!

  **Longformer** uses a combination of a sliding window (*local*) attention and *global* attention. Global attention is user-configured based on the task to allow the model to learn task-specific representations.

@@ -31,12 +27,12 @@ This model was made following the research done by [Iz Beltagy and Matthew E. Pe
  If you want to cite this model you can use this:

  ```bibtex
- @misc{mromero2022longformer-base-4096-spanish,
- title={Spanish LongFormer by Manuel Romero},
+ @misc{narrativa2022legal-longformer-base-4096-spanish,
+ title={Legal Spanish LongFormer by Narrativa},
  author={Romero, Manuel},
  publisher={Hugging Face},
  journal={Hugging Face Hub},
- howpublished={\url{https://huggingface.co/mrm8488/longformer-base-4096-spanish}},
+ howpublished={\url{https://huggingface.co/Narrativa/legal-longformer-base-4096-spanish}},
  year={2022}
  }
  ```
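
The updated description says the model was pre-trained for *MLM* on long Spanish legal documents and accepts sequences of up to 4,096 tokens. A minimal usage sketch with `transformers` (the repo id is taken from the `howpublished` URL in the new citation; the example sentence is invented):

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM, pipeline

# Repo id from the updated citation above.
model_id = "Narrativa/legal-longformer-base-4096-spanish"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Fill-mask over a (made-up) Spanish legal sentence; the tokenizer's own
# mask token is used so the snippet does not hard-code a mask format.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
text = f"El contrato quedará {tokenizer.mask_token} si alguna de las partes incumple sus obligaciones."

for prediction in fill_mask(text):
    print(prediction["token_str"], round(prediction["score"], 3))
```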
 
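The attention paragraph notes that global attention is configured by the user per task. A hedged sketch of what that configuration looks like in `transformers` (assuming the checkpoint loads as a Longformer architecture; which tokens receive global attention is task-dependent and purely illustrative here):

```python
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "Narrativa/legal-longformer-base-4096-spanish"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# A long legal passage (placeholder text); sequences up to 4,096 tokens fit.
inputs = tokenizer("Texto legal muy largo ...", return_tensors="pt",
                   truncation=True, max_length=4096)

# Every token gets sliding-window (local) attention by default.
# Global attention is opted in per token via global_attention_mask;
# here only the first (<s>/CLS) token attends globally.
global_attention_mask = torch.zeros_like(inputs["input_ids"])
global_attention_mask[:, 0] = 1

outputs = model(**inputs, global_attention_mask=global_attention_mask)
print(outputs.last_hidden_state.shape)  # (1, seq_len, hidden_size)
```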