# <a name="introduction"></a> BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese
The pre-trained model `vinai/bartpho-word-base` is the "base" variant of `BARTpho-word`, which uses the "base" architecture and pre-training scheme of the sequence-to-sequence denoising model [BART](https://github.com/pytorch/fairseq/tree/main/examples/bart). The general architecture and experimental results of BARTpho can be found in our [paper](https://arxiv.org/abs/2109.09701):

    @article{bartpho,
        title   = {{BARTpho: Pre-trained Sequence-to-Sequence Models for Vietnamese}},
        author  = {Nguyen Luong Tran and Duong Minh Le and Dat Quoc Nguyen},
        journal = {arXiv preprint},
        volume  = {arXiv:2109.09701},
        year    = {2021}
    }

**Please CITE** our paper when BARTpho is used to help produce published results or is incorporated into other software.
For further information or requests, please go to [BARTpho's homepage](https://github.com/VinAIResearch/BARTpho)!
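
As a quick illustration, below is a minimal usage sketch, assuming the Hugging Face `transformers` library and PyTorch: it loads this checkpoint with `AutoModel`/`AutoTokenizer` and extracts contextual features. The example sentence and the word-segmented input format (syllables of a multi-syllable word joined by underscores) are illustrative assumptions, not part of this card; please consult the homepage above for the exact preprocessing pipeline.

```python
# Minimal usage sketch (illustrative, not part of the original card).
# Assumes the Hugging Face `transformers` library and PyTorch are installed,
# and that input text for the BARTpho-word variant is already word-segmented.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("vinai/bartpho-word-base")
bartpho = AutoModel.from_pretrained("vinai/bartpho-word-base")

# A word-segmented Vietnamese sentence ("We are researchers.") -- assumed example.
line = "Chúng_tôi là những nghiên_cứu_viên ."

inputs = tokenizer(line, return_tensors="pt")
with torch.no_grad():
    # The seq2seq model returns a Seq2SeqModelOutput; last_hidden_state
    # holds the decoder-side contextual features.
    features = bartpho(**inputs)

print(features.last_hidden_state.shape)
```

For generation-style fine-tuning, the same checkpoint could presumably be loaded with `AutoModelForSeq2SeqLM` instead of `AutoModel`; this is a hedged suggestion, and the recommended fine-tuning setup is described on the homepage.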