vmkhlv committed
Commit 31ac14a (verified) · 1 parent: 40897bd

Update README.md




@alenusch added the correct citation.

Files changed (1)
  1. README.md +23 -22
README.md CHANGED
````diff
@@ -78,7 +78,7 @@ datasets:
 thumbnail: "https://github.com/sberbank-ai/mgpt"
 ---
 
-# Multilingual GPT model
+# mGPT 1.3B
 
 We introduce a family of autoregressive GPT-like models with 1.3 billion parameters trained on 61 languages from 25 language families using Wikipedia and Colossal Clean Crawled Corpus.
 
@@ -88,32 +88,33 @@ We reproduce the GPT-3 architecture using GPT-2 sources and the sparse attention
 The source code for the mGPT XL model is available on [Github](https://github.com/sberbank-ai/mgpt)
 
 ## Paper
-mGPT: Few-Shot Learners Go Multilingual
+
+**mGPT: Few-Shot Learners Go Multilingual**
+
+Published at TACL 2024 (MIT Press). Presented at EMNLP 2023.
 
 [Abstract](https://arxiv.org/abs/2204.07580) [PDF](https://arxiv.org/pdf/2204.07580.pdf)
 
-![](https://habrastorage.org/webt/1q/ru/yt/1qruytul6m2m-upyk9frq3pgrds.png)
-
 ```
-@misc{https://doi.org/10.48550/arxiv.2204.07580,
-  doi = {10.48550/ARXIV.2204.07580},
-  url = {https://arxiv.org/abs/2204.07580},
-  author = {Shliazhko, Oleh and Fenogenova, Alena and Tikhonova, Maria and Mikhailov, Vladislav and Kozlova, Anastasia and Shavrina, Tatiana},
-  keywords = {Computation and Language (cs.CL), Artificial Intelligence (cs.AI), FOS: Computer and information sciences, FOS: Computer and information sciences, I.2; I.2.7, 68-06, 68-04, 68T50, 68T01},
-  title = {mGPT: Few-Shot Learners Go Multilingual},
-  publisher = {arXiv},
-  year = {2022},
-  copyright = {Creative Commons Attribution 4.0 International}
+@article{shliazhko-etal-2024-mgpt,
+    title = "m{GPT}: Few-Shot Learners Go Multilingual",
+    author = "Shliazhko, Oleh and
+      Fenogenova, Alena and
+      Tikhonova, Maria and
+      Kozlova, Anastasia and
+      Mikhailov, Vladislav and
+      Shavrina, Tatiana",
+    journal = "Transactions of the Association for Computational Linguistics",
+    volume = "12",
+    year = "2024",
+    address = "Cambridge, MA",
+    publisher = "MIT Press",
+    url = "https://aclanthology.org/2024.tacl-1.4",
+    doi = "10.1162/tacl_a_00633",
+    pages = "58--79",
+    abstract = "This paper introduces mGPT, a multilingual variant of GPT-3, pretrained on 61 languages from 25 linguistically diverse language families using Wikipedia and the C4 Corpus. We detail the design and pretraining procedure. The models undergo an intrinsic and extrinsic evaluation: language modeling in all languages, downstream evaluation on cross-lingual NLU datasets and benchmarks in 33 languages, and world knowledge probing in 23 languages. The in-context learning abilities are on par with the contemporaneous language models while covering a larger number of languages, including underrepresented and low-resource languages of the Commonwealth of Independent States and the indigenous peoples in Russia. The source code and the language models are publicly available under the MIT license.",
 }
-
-```
+```
 
 
 ## Languages
````
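
As a supplement to the model card updated in this commit (not part of the diff itself): a minimal sketch of loading the mGPT 1.3B checkpoint described above with the Hugging Face `transformers` library. The Hub repository id `ai-forever/mGPT` and the prompt are illustrative assumptions; the README only links the GitHub repository.

```python
# Minimal usage sketch (not part of this commit): loading mGPT with Hugging Face transformers.
# The Hub repository id "ai-forever/mGPT" is an assumption; the README only links the GitHub repo.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai-forever/mGPT"  # assumed repository id for the 1.3B multilingual checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# mGPT is an autoregressive GPT-style model, so standard left-to-right generation applies.
prompt = "The mGPT model covers 61 languages, including"  # illustrative prompt
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=True, top_p=0.95)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the model reproduces the GPT-3 architecture from GPT-2 sources, any GPT-2-compatible decoding settings (greedy, beam search, or the nucleus sampling shown here) can be used.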