---
library_name: transformers
tags:
  - generated_from_trainer
model-index:
  - name: fixed-distil-xlstm
    results: []
---

fixed-distil-xlstm

This model was fine-tuned from an unspecified base model on an unspecified dataset. It achieves the following results on the evaluation set:

  • Loss: 3.4315
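
If this evaluation loss is a mean token-level cross-entropy (the usual case for Trainer-based language-model fine-tuning), 3.4315 corresponds to a perplexity of exp(3.4315) ≈ 30.9. Below is a minimal scoring sketch; it assumes the checkpoint loads through the standard causal-LM API, and both the repo id and the need for `trust_remote_code` are assumptions, not confirmed by this card:

```python
import math

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hypothetical Hub path; the card does not state the exact repo id.
repo_id = "thiomajid/fixed-distil-xlstm"

# trust_remote_code=True is assumed, since distil-xLSTM-style models
# typically ship a custom architecture not registered in transformers.
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)
model.eval()

text = "The quick brown fox jumps over the lazy dog."
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    # Passing labels to a causal-LM head returns the mean cross-entropy loss.
    outputs = model(**inputs, labels=inputs["input_ids"])

loss = outputs.loss.item()
print(f"loss = {loss:.4f}, perplexity = {math.exp(loss):.1f}")
```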

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training:

  • learning_rate: 0.0002
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 32
  • optimizer: adamw_torch (AdamW) with betas=(0.9, 0.999) and epsilon=1e-08; no additional optimizer arguments
  • lr_scheduler_type: cosine
  • lr_scheduler_warmup_ratio: 0.1
  • num_epochs: 2
  • mixed_precision_training: Native AMP
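
For reference, the sketch below reconstructs these settings as `transformers` `TrainingArguments`. It is an approximation rather than the actual training script: the output directory is a placeholder, and anything not listed above is left at its default.

```python
from transformers import TrainingArguments

# Rough reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="fixed-distil-xlstm",  # placeholder
    learning_rate=2e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 8 * 4 = 32
    optim="adamw_torch",
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    lr_scheduler_type="cosine",
    warmup_ratio=0.1,
    num_train_epochs=2,
    fp16=True,  # "Native AMP" mixed precision
)
```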

Training results

| Training Loss | Epoch  | Step | Validation Loss |
|:-------------:|:------:|:----:|:---------------:|
| 12.3423       | 0.32   | 100  | 3.5629          |
| 12.2017       | 0.64   | 200  | 3.5453          |
| 12.1637       | 0.96   | 300  | 3.4930          |
| 11.3792       | 1.2784 | 400  | 3.4661          |
| 11.4163       | 1.5984 | 500  | 3.4403          |
| 11.5901       | 1.9184 | 600  | 3.4315          |

Framework versions

  • Transformers 4.47.1
  • PyTorch 2.5.1+cu121
  • Datasets 3.2.0
  • Tokenizers 0.21.0