---
license: cc-by-nc-4.0
base_model: Zamoranesis/mental_bert
tags:
  - generated_from_trainer
metrics:
  - f1
model-index:
  - name: mental_bert_classifier
    results: []
---

# mental_bert_classifier

This model is a fine-tuned version of [Zamoranesis/mental_bert](https://huggingface.co/Zamoranesis/mental_bert) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 0.3709
- F1 Class 0: 0.8475
- F1 Class 1: 0.9157
- F1 Class 2: 0.7692
- F1 Class 3: 0.8764
- F1 Class 4: 0.8772
- F1: 0.8572
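The aggregate F1 of 0.8572 appears to be the unweighted (macro) average of the five per-class scores, which can be checked directly:

```python
# Per-class F1 scores reported on the evaluation set (from the card above).
per_class_f1 = {0: 0.8475, 1: 0.9157, 2: 0.7692, 3: 0.8764, 4: 0.8772}

# The aggregate "F1: 0.8572" is consistent with an unweighted (macro) average.
macro_f1 = sum(per_class_f1.values()) / len(per_class_f1)
print(round(macro_f1, 4))  # 0.8572
```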

## Model description

More information needed

## Intended uses & limitations

More information needed
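Pending fuller documentation, the checkpoint can presumably be loaded with the standard `AutoModelForSequenceClassification.from_pretrained("Zamoranesis/mental_bert_classifier")` API; since the label names are not documented, downstream code only sees five class indices. A minimal, self-contained sketch of turning five raw logits into a predicted class index (softmax then argmax, the usual decoding for a sequence classifier; the logit values below are hypothetical):

```python
import math

def decode_logits(logits):
    """Softmax over raw classifier logits, then argmax to a class index."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    return probs.index(max(probs)), probs

# Hypothetical logits for one input over the model's 5 classes.
pred, probs = decode_logits([1.2, 3.4, 0.1, -0.5, 0.7])
print(pred)  # 1
```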

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 0.0005
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- lr_scheduler_warmup_steps: 100
- training_steps: 1000
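The warmup settings are mutually consistent: 10% of the 1,000 training steps equals the stated 100 warmup steps. A small sketch of the resulting schedule, modelled on `transformers`' `get_linear_schedule_with_warmup` (linear ramp up to the base rate, then linear decay to zero); the function name here is illustrative, not from the training code:

```python
def linear_warmup_lr(step, base_lr=5e-4, warmup_steps=100, total_steps=1000):
    """Learning rate at a given optimizer step: linear warmup, then linear decay."""
    if step < warmup_steps:
        return base_lr * step / warmup_steps
    return base_lr * max(0.0, (total_steps - step) / (total_steps - warmup_steps))

print(linear_warmup_lr(50))    # 0.00025  (halfway through warmup)
print(linear_warmup_lr(100))   # 0.0005   (peak, end of warmup)
print(linear_warmup_lr(1000))  # 0.0      (fully decayed at the final step)
```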

### Training results

| Training Loss | Epoch | Step | Validation Loss | F1 Class 0 | F1 Class 1 | F1 Class 2 | F1 Class 3 | F1 Class 4 | F1     |
|:-------------:|:-----:|:----:|:---------------:|:----------:|:----------:|:----------:|:----------:|:----------:|:------:|
| 1.3938        | 6.25  | 100  | 1.0235          | 0.7458     | 0.8000     | 0.6957     | 0.8315     | 0.8197     | 0.7785 |
| 0.7425        | 12.5  | 200  | 0.6302          | 0.7667     | 0.8750     | 0.7755     | 0.8478     | 0.8475     | 0.8225 |
| 0.5225        | 18.75 | 300  | 0.5192          | 0.8276     | 0.9268     | 0.7843     | 0.8667     | 0.8475     | 0.8506 |
| 0.4433        | 25.0  | 400  | 0.4599          | 0.8276     | 0.9157     | 0.7692     | 0.8636     | 0.8475     | 0.8447 |
| 0.3862        | 31.25 | 500  | 0.4208          | 0.8475     | 0.9157     | 0.7692     | 0.8636     | 0.8621     | 0.8516 |
| 0.3702        | 37.5  | 600  | 0.3987          | 0.8475     | 0.9157     | 0.7692     | 0.8764     | 0.8772     | 0.8572 |
| 0.3437        | 43.75 | 700  | 0.3872          | 0.8475     | 0.9157     | 0.7692     | 0.8764     | 0.8772     | 0.8572 |
| 0.3370        | 50.0  | 800  | 0.3759          | 0.8475     | 0.9157     | 0.7692     | 0.8764     | 0.8772     | 0.8572 |
| 0.3253        | 56.25 | 900  | 0.3727          | 0.8475     | 0.9157     | 0.7692     | 0.8764     | 0.8772     | 0.8572 |
| 0.3236        | 62.5  | 1000 | 0.3709          | 0.8475     | 0.9157     | 0.7692     | 0.8764     | 0.8772     | 0.8572 |
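Although the training data is undocumented, the epoch/step columns pin down its approximate size: 100 steps cover 6.25 epochs, i.e. 16 steps per epoch, which at a train batch size of 64 implies roughly 1,024 training examples (an estimate that assumes no examples are dropped from the last batch of each epoch):

```python
# Epoch/step pair from the first row of the training table above.
steps, epochs = 100, 6.25
steps_per_epoch = steps / epochs            # 16.0
train_batch_size = 64

# Rough dataset size implied by the logged schedule.
approx_train_examples = steps_per_epoch * train_batch_size
print(approx_train_examples)  # 1024.0
```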

### Framework versions

- Transformers 4.33.3
- Pytorch 2.0.1+cu118
- Datasets 2.14.5
- Tokenizers 0.13.3