# ECBERT-base
This model is a fine-tuned version of [answerdotai/ModernBERT-base](https://huggingface.co/answerdotai/ModernBERT-base), trained on the dataset from Gorodnichenko, Y., Pham, T., & Talavera, O. (2023). *Data and Code for: The Voice of Monetary Policy* (Version v1) [Dataset]. ICPSR - Inter-university Consortium for Political and Social Research. https://doi.org/10.3886/E178302V1. It achieves the following results on the evaluation set:
- Loss: 0.4662
- Accuracy: 84.34%
## Model description
More information needed
## Intended uses & limitations
More information needed
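A minimal usage sketch for loading the checkpoint with the `transformers` `Auto*` classes. The label names in the example assertion are placeholders; the actual `id2label` mapping lives in the model's `config.json` and is not documented in this card.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "Graimond/ECBERT-base"

def top_label(logits: torch.Tensor, id2label: dict) -> str:
    """Return the highest-scoring label for a (1, num_labels) logits tensor."""
    return id2label[int(logits.argmax(dim=-1))]

def classify(text: str) -> str:
    """Download the fine-tuned checkpoint and classify a single sentence."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        logits = model(**inputs).logits
    return top_label(logits, model.config.id2label)
```

For example, `classify("The Governing Council decided to raise the three key ECB interest rates.")` returns one of the model's configured labels.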
## Training and evaluation data
Gorodnichenko, Y., Pham, T., & Talavera, O. (2023). *Data and Code for: The Voice of Monetary Policy* (Version v1) [Dataset]. ICPSR - Inter-university Consortium for Political and Social Research. https://doi.org/10.3886/E178302V1.
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-5
- train_batch_size: 16
- eval_batch_size: 8
- seed: 42
- lr_scheduler_type: linear
- epochs: 10
- mixed_precision_training: Native AMP
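The hyperparameters above can be expressed as a `transformers.TrainingArguments` configuration fragment. This is a sketch only: `output_dir` is an assumption, `fp16=True` is the usual way to request Native AMP, and the `Trainer`/dataset wiring is omitted.

```python
from transformers import TrainingArguments

# Configuration sketch mirroring the hyperparameters listed above.
# output_dir is illustrative; fp16=True corresponds to Native AMP.
training_args = TrainingArguments(
    output_dir="ECBERT-base",
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=10,
    fp16=True,
)
```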
### Training results
| Training Loss | Epoch | Step | Validation Loss |
|---|---|---|---|
| No log | 1.0 | 63 | 0.9463 |
| 1.0232 | 1.5873 | 100 | N/A |
| No log | 2.0 | 126 | 0.5973 |
| No log | 3.0 | 189 | 0.4892 |
| 0.5002 | 3.1746 | 200 | N/A |
| No log | 4.0 | 252 | 0.4662 |
| 0.2978 | 4.7619 | 300 | N/A |
| No log | 5.0 | 315 | 0.4730 |
| No log | 6.0 | 378 | 0.6108 |
| 0.2891 | 6.3492 | 400 | N/A |
| No log | 7.0 | 441 | 0.4973 |
| 0.2692 | 7.9365 | 500 | N/A |
| No log | 8.0 | 504 | 0.4904 |
| No log | 9.0 | 567 | 0.5685 |
| 0.1863 | 9.5238 | 600 | N/A |
| No log | 10.0 | 630 | 0.6198 |
| No log | 10.0 | 630 | 0.4662 |
### Framework versions
- Transformers 4.48.0.dev0
- Pytorch 2.5.1+cu121
- Datasets 3.2.0
- Tokenizers 0.21.0