|
--- |
|
language: |
|
- es |
|
tags: |
|
- albert |
|
- spanish |
|
- OpenCENIA |
|
datasets: |
|
- large_spanish_corpus |
|
--- |
|
|
|
# ALBERT XLarge Spanish |
|
|
|
This is an [ALBERT](https://github.com/google-research/albert) model trained on a [large Spanish corpus](https://github.com/josecannete/spanish-corpora).
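
Below is a minimal usage sketch with 🤗 Transformers. The Hub model id is an assumption and may need adjusting to wherever this checkpoint is actually hosted.

```python
# Minimal masked-LM usage sketch. NOTE: the Hub id below is an
# assumption; point it at wherever this checkpoint is hosted.
import torch
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "dccuchile/albert-xlarge-spanish"  # hypothetical Hub id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

text = "Santiago es la capital de [MASK]."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Pick the highest-scoring token at the [MASK] position.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
predicted_id = logits[0, mask_pos].argmax(-1)
print(tokenizer.decode(predicted_id))
```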
|
The model was trained on a single TPU v3-8 with the following hyperparameters, step counts, and training time (a schedule sketch follows the list):
|
- LR: 0.0003125 |
|
- Batch Size: 128 |
|
- Warmup ratio: 0.00078125 |
|
- Warmup steps: 6250 |
|
- Goal steps: 8000000 |
|
- Total steps: 2775000 |
|
- Total training time (approx.): 64.2 days
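
As a rough illustration of how these numbers fit together (warmup ratio = warmup steps / goal steps, i.e. 6250 / 8000000 = 0.00078125), here is a hedged sketch of the learning-rate schedule. The linear warmup/decay shape and the AdamW optimizer are assumptions, since the original ALBERT recipe used LAMB with its own decay.

```python
# Sketch of the schedule implied by the listed hyperparameters.
# ASSUMPTIONS: linear warmup + linear decay and AdamW are stand-ins;
# the original ALBERT pretraining recipe used LAMB.
import torch
from transformers import get_linear_schedule_with_warmup

LR = 0.0003125
GOAL_STEPS = 8_000_000
WARMUP_STEPS = 6_250  # 6_250 / 8_000_000 == 0.00078125, the listed warmup ratio

params = [torch.nn.Parameter(torch.zeros(1))]  # placeholder parameters
optimizer = torch.optim.AdamW(params, lr=LR)
scheduler = get_linear_schedule_with_warmup(
    optimizer, num_warmup_steps=WARMUP_STEPS, num_training_steps=GOAL_STEPS
)

# LR ramps linearly to 0.0003125 over the first 6,250 steps, then decays.
for _ in range(10):
    optimizer.step()
    scheduler.step()
print(scheduler.get_last_lr())
```

Note that training was stopped at 2,775,000 of the 8,000,000 goal steps, so the decay phase was never completed.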
|
|
|
## Training loss |
|
![Training loss curve](https://drive.google.com/uc?export=view&id=1rw0vvqZY9LZAzRUACLjmP18Fc6D1fv7x)