---
language: en
tags:
- fill-mask
kwargs:
  timestamp: '2024-05-11T15:12:34'
  project_name: ThunBERT_bs8_lr5_emissions_tracker
  run_id: 57d7b2c3-c944-47c7-a355-25d9fbaa40f9
  duration: 174517.94656181335
  emissions: 0.182665335387375
  emissions_rate: 1.0466851059508425e-06
  cpu_power: 42.5
  gpu_power: 0.0
  ram_power: 37.5
  cpu_energy: 2.0602784545400072
  gpu_energy: 0
  ram_energy: 1.8178841192349864
  energy_consumed: 3.8781625737749743
  country_name: Switzerland
  country_iso_code: CHE
  region: .nan
  cloud_provider: .nan
  cloud_region: .nan
  os: Linux-5.14.0-70.30.1.el9_0.x86_64-x86_64-with-glibc2.34
  python_version: 3.10.4
  codecarbon_version: 2.3.4
  cpu_count: 4
  cpu_model: Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz
  gpu_count: .nan
  gpu_model: .nan
  longitude: .nan
  latitude: .nan
  ram_total_size: 100
  tracking_mode: machine
  on_cloud: N
  pue: 1.0
---
## Environmental Impact (CodeCarbon default)
| Metric | Value |
|--------------------------|---------------------------------|
| Duration (in seconds) | 174517.94656181335 |
| Emissions (CO2eq in kg)  | 0.182665335387375 |
| CPU power (W) | 42.5 |
| GPU power (W) | [No GPU] |
| RAM power (W) | 37.5 |
| CPU energy (kWh) | 2.0602784545400072 |
| GPU energy (kWh) | [No GPU] |
| RAM energy (kWh) | 1.8178841192349864 |
| Consumed energy (kWh) | 3.8781625737749743 |
| Country name | Switzerland |
| Cloud provider           | [Not on cloud] |
| Cloud region             | [Not on cloud] |
| CPU count | 4 |
| CPU model | Intel(R) Xeon(R) Platinum 8360Y CPU @ 2.40GHz |
| GPU count                | [No GPU] |
| GPU model                | [No GPU] |
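The figures in this table are internally consistent: CodeCarbon integrates the constant CPU and RAM power draws over the run duration, and the reported emissions imply a grid carbon intensity close to Switzerland's (roughly 47 g CO2eq/kWh). A minimal arithmetic sketch, using only values from the table:

```python
# Recompute the reported energy figures from power draw and duration.
duration_s = 174517.94656181335   # run duration (s)
cpu_power_w = 42.5                # constant CPU power draw (W)
ram_power_w = 37.5                # constant RAM power draw (W)

J_PER_KWH = 3.6e6  # joules in one kilowatt-hour

cpu_energy_kwh = cpu_power_w * duration_s / J_PER_KWH
ram_energy_kwh = ram_power_w * duration_s / J_PER_KWH
total_kwh = cpu_energy_kwh + ram_energy_kwh  # no GPU in this run

emissions_kg = 0.182665335387375  # reported CO2eq
intensity_kg_per_kwh = emissions_kg / total_kwh

print(f"CPU energy: {cpu_energy_kwh:.4f} kWh")  # ≈ 2.0603, matches the table
print(f"RAM energy: {ram_energy_kwh:.4f} kWh")  # ≈ 1.8179, matches the table
print(f"Total:      {total_kwh:.4f} kWh")       # ≈ 3.8782, matches the table
print(f"Implied grid intensity: {intensity_kg_per_kwh * 1000:.1f} g CO2eq/kWh")
```

Small deviations (at the fourth decimal and beyond) come from CodeCarbon's periodic sampling rather than a single end-of-run multiplication.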
## Environmental Impact (for one core)
| Metric | Value |
|--------------------------|---------------------------------|
| CPU energy (kWh) | 0.33594704713149065 |
| Emissions (CO2eq in kg)  | 0.06835286240337689 |
## Note
15 May 2024
## My Config
| Config | Value |
|--------------------------|-----------------|
| checkpoint | albert-base-v2 |
| model_name | ThunBERT_bs8_lr5 |
| sequence_length | 400 |
| num_epoch | 6 |
| learning_rate | 5e-05 |
| batch_size | 8 |
| weight_decay | 0.0 |
| warm_up_prop | 0.0 |
| drop_out_prob | 0.1 |
| packing_length | 100 |
| train_test_split | 0.2 |
| num_steps | 82627 |
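Since the card is tagged `fill-mask`, the checkpoint can be used through the standard 🤗 Transformers `pipeline` API. A hypothetical usage sketch; the Hub repo id below is an assumption derived from this card's model name, so adjust it if the actual id differs:

```python
# Hypothetical repo id, assumed from the model_name in the config above.
MODEL_ID = "damgomz/ThunBERT_bs8_lr5"

def load_fill_mask(model_id: str = MODEL_ID):
    """Build a fill-mask pipeline for this checkpoint (downloads weights)."""
    from transformers import pipeline  # requires `pip install transformers`
    return pipeline("fill-mask", model=model_id)
```

The base checkpoint (`albert-base-v2`) uses `[MASK]` as its mask token, so a call would look like `load_fill_mask()("Bern is the [MASK] of Switzerland.")`.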
## Training and Testing Steps
| Epoch | Train Loss | Test Loss |
|-------|------------|-----------|
| 0.0 | 7.067743 | 3.953574 |
| 0.5 | 3.522399 | 3.376064 |
| 1.0 | 3.303520 | 3.226876 |
| 1.5 | 3.154914 | 3.167486 |
| 2.0 | 3.058335 | 3.051049 |
| 2.5 | 2.983440 | 2.994546 |
| 3.0 | 2.966602 | 2.926526 |
| 3.5 | 2.846851 | 2.879127 |
| 4.0 | 2.785210 | 2.832286 |
| 4.5 | 2.718725 | 2.795912 |
| 5.0 | 2.670722 | 2.733300 |
| 5.5 | 2.628934 | 2.693741 |
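The curve above can be sanity-checked programmatically. A small sketch, with the data copied from the table, confirming that the test loss decreases at every half-epoch checkpoint (i.e. no sign of overfitting within the 6 epochs):

```python
# (epoch, train_loss, test_loss) triples copied from the table above.
history = [
    (0.0, 7.067743, 3.953574),
    (0.5, 3.522399, 3.376064),
    (1.0, 3.303520, 3.226876),
    (1.5, 3.154914, 3.167486),
    (2.0, 3.058335, 3.051049),
    (2.5, 2.983440, 2.994546),
    (3.0, 2.966602, 2.926526),
    (3.5, 2.846851, 2.879127),
    (4.0, 2.785210, 2.832286),
    (4.5, 2.718725, 2.795912),
    (5.0, 2.670722, 2.733300),
    (5.5, 2.628934, 2.693741),
]

test_losses = [t for _, _, t in history]
# Strictly decreasing test loss at every logged checkpoint.
assert all(a > b for a, b in zip(test_losses, test_losses[1:]))

total_drop = test_losses[0] - test_losses[-1]
print(f"Test loss: {test_losses[0]:.3f} -> {test_losses[-1]:.3f} "
      f"(-{total_drop:.3f} over the logged checkpoints)")
```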