Upload tokenizer #23
by ArthurZ
Update the tokenizer to make sure that its length is the same as config.vocab_size, following the issue reported here on GitHub.
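For context, a minimal sketch of the kind of check this fixes (the checkpoint name is hypothetical):

```python
from transformers import AutoConfig, AutoTokenizer

# Hypothetical checkpoint name, for illustration only.
checkpoint = "org/model"

config = AutoConfig.from_pretrained(checkpoint)
tokenizer = AutoTokenizer.from_pretrained(checkpoint)

# len(tokenizer) counts the base vocabulary plus any added tokens;
# it should line up with what the model config expects.
assert len(tokenizer) == config.vocab_size, (len(tokenizer), config.vocab_size)
```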
The normalized / not-normalized information is not super important for previous transformers versions, so this fixes them: before, the fast tokenizer had normalized=False while the slow one had normalized=True.
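One way to see the mismatch is to compare the flags on the added tokens of the slow and fast tokenizers. A sketch assuming a recent transformers version, where added_tokens_decoder exposes AddedToken objects (checkpoint name hypothetical):

```python
from transformers import AutoTokenizer

# Hypothetical checkpoint name, for illustration only.
slow = AutoTokenizer.from_pretrained("org/model", use_fast=False)
fast = AutoTokenizer.from_pretrained("org/model", use_fast=True)

# added_tokens_decoder maps token id -> AddedToken; the AddedToken carries
# the normalized flag, so a slow/fast mismatch shows up directly.
for idx, token in sorted(fast.added_tokens_decoder.items()):
    print(idx, repr(token.content), "fast normalized =", token.normalized)
for idx, token in sorted(slow.added_tokens_decoder.items()):
    print(idx, repr(token.content), "slow normalized =", token.normalized)
```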