---
language: pt
tags:
- t5
- pytorch
- tensorflow
license: mit
inference: false
---


<!-- datasets:
- brWaC -->
<!-- widget:
- inference: false -->
# Portuguese T5 (aka "PTT5")

## Introduction
PTT5 is a T5 model pretrained on the BrWAC corpus, a large collection of web pages in Portuguese, which improves T5's performance on Portuguese sentence similarity and entailment tasks. It is available in three sizes (small, base and large) and with two vocabularies (Google's original T5 vocabulary and ours, trained on Portuguese Wikipedia).

For further information or requests, please go to [PTT5 repository](https://github.com/unicamp-dl/PTT5).

## Available models
<!-- With link -->
| Model                                    | Architecture                                                   | #Params  | Vocabulary         |
| :-:                                      | :-:                                                            | :-:      | :-:                |            
| [unicamp-dl/ptt5-small-t5-vocab](https://huggingface.co/unicamp-dl/ptt5-small-t5-vocab)                   | t5-small | 60M  | Google's T5 |
| [unicamp-dl/ptt5-base-t5-vocab](https://huggingface.co/unicamp-dl/ptt5-base-t5-vocab)                     | t5-base  | 220M | Google's T5 |
| [unicamp-dl/ptt5-large-t5-vocab](https://huggingface.co/unicamp-dl/ptt5-large-t5-vocab)                   | t5-large | 740M | Google's T5 |
| [unicamp-dl/ptt5-small-portuguese-vocab](https://huggingface.co/unicamp-dl/ptt5-small-portuguese-vocab)   | t5-small | 60M  | Portuguese  |
| [unicamp-dl/ptt5-base-portuguese-vocab](https://huggingface.co/unicamp-dl/ptt5-base-portuguese-vocab)     | t5-base  | 220M | Portuguese  |
| [unicamp-dl/ptt5-large-portuguese-vocab](https://huggingface.co/unicamp-dl/ptt5-large-portuguese-vocab)   | t5-large | 740M | Portuguese  |


## Usage
```python
# Tokenizer
from transformers import T5Tokenizer  # or AutoTokenizer

# PyTorch (bare model, bare model + language modeling head)
from transformers import T5Model, T5ForConditionalGeneration

# TensorFlow (bare model, bare model + language modeling head)
from transformers import TFT5Model, TFT5ForConditionalGeneration

model_name = 'unicamp-dl/ptt5-base-portuguese-vocab'

tokenizer = T5Tokenizer.from_pretrained(model_name)

# PyTorch
model_pt = T5ForConditionalGeneration.from_pretrained(model_name)

# TensorFlow
model_tf = TFT5ForConditionalGeneration.from_pretrained(model_name)
```
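Once loaded, the models behave like any other T5 checkpoint in `transformers`. Below is a minimal inference sketch using the PyTorch model from above; the Portuguese prompt and the `max_length` value are illustrative choices, not part of the original card:

```python
# Encode a Portuguese sentence containing a T5 sentinel token,
# asking the model to fill in the masked span.
input_ids = tokenizer(
    'O PTT5 é um modelo de linguagem treinado em <extra_id_0>.',
    return_tensors='pt'
).input_ids

# Greedy decoding with the PyTorch model loaded above.
outputs = model_pt.generate(input_ids, max_length=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```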


## Citation
We are preparing an arXiv submission and will provide a citation soon. In the meantime, if you need to cite this work, use:
```bibtex
@misc{ptt5_2020,
  Author = {Carmo, Diedre and Piau, Marcos and Campiotti, Israel and Nogueira, Rodrigo and Lotufo, Roberto},
  Title = {PTT5: Pre-training and validating the T5 transformer in Brazilian Portuguese data},
  Year = {2020},
  Publisher = {GitHub},
  Journal = {GitHub repository},
  Howpublished = {\url{https://github.com/unicamp-dl/PTT5}}
}
```