Fill-Mask · Transformers · PyTorch · Spanish · bert · masked-lm · Inference Endpoints
jorgeortizfuentes committed
Commit 647c107 · 1 Parent(s): e80ee52

Update README.md

Files changed (1): README.md (+9 -8)
README.md CHANGED
@@ -1,16 +1,17 @@
 ---
+language:
+- es
 tags:
-- generated_from_trainer
-metrics:
-- accuracy
-model-index:
-- name: tulio-cased
-  results: []
+- masked-lm
+license: gpl-3.0
+datasets:
+- jorgeortizfuentes/chilean_spanish_corpus
+pipeline_tag: fill-mask
 ---
 
 # tulio-cased
 
-This model is a fine-tuned version of [dccuchile/bert-base-spanish-wwm-cased](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) on the None dataset.
+This model is a fine-tuned version of [dccuchile/bert-base-spanish-wwm-cased](https://huggingface.co/dccuchile/bert-base-spanish-wwm-cased) on Chilean Spanish Corpus.
 It achieves the following results on the evaluation set:
 - Loss: 1.765
 
@@ -48,4 +49,4 @@ The following hyperparameters were used during training:
 - Transformers 4.27.0.dev0
 - Pytorch 1.13.1+cu117
 - Datasets 2.9.0
-- Tokenizers 0.13.2
+- Tokenizers 0.13.2
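
Since the commit sets `pipeline_tag: fill-mask`, the model should be usable through the Transformers fill-mask pipeline. A minimal sketch, assuming the model is published as `jorgeortizfuentes/tulio-cased` (a repo id inferred from the committer's username and the model name, not stated in the diff) and that it keeps the standard BERT `[MASK]` token of its base model:

```python
from transformers import pipeline

# Repo id is an assumption inferred from the commit author and model name;
# adjust it if the model lives under a different namespace.
fill_mask = pipeline("fill-mask", model="jorgeortizfuentes/tulio-cased")

# BERT-style models mark the blank with the [MASK] special token.
predictions = fill_mask("El clima en Santiago es muy [MASK].")

# Each prediction carries the filled-in token and its probability.
for p in predictions:
    print(p["token_str"], round(p["score"], 3))
```

The pipeline downloads the model weights on first use; the list of predictions is sorted by score, so `predictions[0]` is the model's top completion.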