Environmental Impact (CodeCarbon default)
| Metric | Value |
|---|---|
| Duration (seconds) | 169047.08735966682 |
| Emissions (kg CO2eq) | 0.1769390204060543 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 37.5 |
| CPU energy (kWh) | 1.9956917336564937 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 1.7608956085666998 |
| Consumed energy (kWh) | 3.756587342223187 |
| Country name | Switzerland |
| Cloud provider | nan |
| Cloud region | nan |
| CPU count | 4 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count | nan |
| GPU model | nan |
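
The reported energies follow directly from the logged power draw and duration (energy in kWh = power in W × duration in s ÷ 3.6 × 10^6). Below is a minimal Python sketch of that arithmetic, using the values from the table above; the grid carbon intensity is inferred here from the reported emissions rather than taken from CodeCarbon's own database:

```python
# Recompute the CodeCarbon figures from the logged power draw and duration.
duration_s = 169047.08735966682   # run time in seconds
cpu_power_w = 42.5                # CPU power (W)
ram_power_w = 37.5                # RAM power (W)

# Energy (kWh) = power (W) * time (s) / 3.6e6
cpu_energy_kwh = cpu_power_w * duration_s / 3.6e6    # ~1.996 kWh
ram_energy_kwh = ram_power_w * duration_s / 3.6e6    # ~1.761 kWh
total_energy_kwh = cpu_energy_kwh + ram_energy_kwh   # ~3.757 kWh

# Implied grid carbon intensity: reported emissions divided by consumed energy,
# roughly 0.047 kg CO2eq per kWh, consistent with Switzerland's low-carbon mix.
emissions_kg = 0.1769390204060543
carbon_intensity_kg_per_kwh = emissions_kg / total_energy_kwh

print(f"CPU energy:   {cpu_energy_kwh:.3f} kWh")
print(f"RAM energy:   {ram_energy_kwh:.3f} kWh")
print(f"Total energy: {total_energy_kwh:.3f} kWh")
print(f"Carbon intensity: {carbon_intensity_kg_per_kwh * 1000:.1f} g CO2eq/kWh")
```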
Environmental Impact (for one core)
| Metric | Value |
|---|---|
| CPU energy (kWh) | 0.3254156431673586 |
| Emissions (kg CO2eq) | 0.0662101092158695 |
Note
15 May 2024
My Config
| Config | Value |
|---|---|
| checkpoint | albert-base-v2 |
| model_name | ThunBERT_bs16_lr4 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 0.0005 |
| batch_size | 16 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 41045 |
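
For reference, here is a minimal sketch of how these hyperparameters could be wired into a Hugging Face `TrainingArguments` object. This is not the author's training script: the masked-language-modelling objective is an assumption suggested by the `albert-base-v2` checkpoint and the loss values, and the data loading, packing, and `Trainer` setup are omitted.

```python
from transformers import AlbertForMaskedLM, AutoTokenizer, TrainingArguments

# Checkpoint and hyperparameters copied from the config table above.
checkpoint = "albert-base-v2"
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AlbertForMaskedLM.from_pretrained(checkpoint)  # assumed MLM objective

training_args = TrainingArguments(
    output_dir="ThunBERT_bs16_lr4",   # model_name
    num_train_epochs=6,               # num_epoch
    learning_rate=5e-4,               # learning_rate
    per_device_train_batch_size=16,   # batch_size
    per_device_eval_batch_size=16,
    weight_decay=0.0,                 # weight_decay
    warmup_ratio=0.0,                 # warm_up_prop
)
```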
Training and Testing steps
| Epoch | Train Loss | Test Loss |
|---|---|---|
| 0.0 | 6.672004 | 11.475011 |
| 0.5 | 7.858635 | 7.770452 |
| 1.0 | 7.738486 | 7.745920 |
| 1.5 | 7.705243 | 7.718904 |
| 2.0 | 7.692092 | 7.719356 |
| 2.5 | 7.684155 | 7.708430 |
| 3.0 | 7.674921 | 7.695158 |
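
To visualise the loss trajectory, the logged values can be plotted directly; a minimal matplotlib sketch using the numbers from the table above:

```python
import matplotlib.pyplot as plt

# Values copied from the training/testing table above.
epochs     = [0.0, 0.5, 1.0, 1.5, 2.0, 2.5, 3.0]
train_loss = [6.672004, 7.858635, 7.738486, 7.705243, 7.692092, 7.684155, 7.674921]
test_loss  = [11.475011, 7.770452, 7.745920, 7.718904, 7.719356, 7.708430, 7.695158]

plt.plot(epochs, train_loss, marker="o", label="train loss")
plt.plot(epochs, test_loss, marker="o", label="test loss")
plt.xlabel("epoch")
plt.ylabel("loss")
plt.legend()
plt.savefig("loss_curve.png")  # or plt.show()
```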