gpt2_bwgenerator

This model is a fine-tuned version of openai-community/gpt2 on an unknown dataset. It achieves the following results on the evaluation set:

  • Loss: 0.0925
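
Since the card does not document the task or input format, the snippet below is only a minimal usage sketch: it assumes the checkpoint is a standard GPT-2 causal language model published under the repo id NanQiangHF/gpt2_bwgenerator (taken from the card's model tree), and the prompt is a placeholder.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Minimal sketch, assuming a standard GPT-2 causal LM checkpoint.
model_id = "NanQiangHF/gpt2_bwgenerator"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "..."  # placeholder; the intended input format is not documented
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(
    **inputs,
    max_new_tokens=64,
    pad_token_id=tokenizer.eos_token_id,  # GPT-2 has no pad token
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```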

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a reproduction sketch follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 32
  • eval_batch_size: 32
  • seed: 42
  • optimizer: adamw_torch with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: linear
  • num_epochs: 3
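
Because the dataset and task are undocumented, the following is only a sketch of how the listed hyperparameters map onto `transformers.TrainingArguments`; the output path is a placeholder, and the dataset, model, and `Trainer` setup are left out.

```python
from transformers import TrainingArguments

# Sketch only: mirrors the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="gpt2_bwgenerator",  # placeholder output path
    learning_rate=5e-5,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    optim="adamw_torch",            # betas=(0.9, 0.999), epsilon=1e-08 are the defaults
    lr_scheduler_type="linear",
    num_train_epochs=3,
)
```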

Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 0.3023        | 0.1574 | 300  | 0.1644          |
| 0.167         | 0.3148 | 600  | 0.1376          |
| 0.1466        | 0.4722 | 900  | 0.1301          |
| 0.1354        | 0.6296 | 1200 | 0.1180          |
| 0.1258        | 0.7870 | 1500 | 0.1129          |
| 0.1198        | 0.9444 | 1800 | 0.1061          |
| 0.115         | 1.1018 | 2100 | 0.1030          |
| 0.1108        | 1.2592 | 2400 | 0.1013          |
| 0.1088        | 1.4166 | 2700 | 0.1000          |
| 0.1067        | 1.5740 | 3000 | 0.0982          |
| 0.1049        | 1.7314 | 3300 | 0.0974          |
| 0.1039        | 1.8888 | 3600 | 0.0960          |
| 0.1024        | 2.0462 | 3900 | 0.0952          |
| 0.1013        | 2.2036 | 4200 | 0.0947          |
| 0.1006        | 2.3610 | 4500 | 0.0944          |
| 0.0997        | 2.5184 | 4800 | 0.0935          |
| 0.0993        | 2.6758 | 5100 | 0.0927          |
| 0.099         | 2.8332 | 5400 | 0.0928          |
| 0.0983        | 2.9906 | 5700 | 0.0925          |

Framework versions

  • Transformers 4.47.1
  • PyTorch 2.5.1
  • Datasets 3.2.0
  • Tokenizers 0.21.0