swiftformer-xs-DMAE-ALT

This model is a fine-tuned version of MBZUAI/swiftformer-xs on the imagefolder dataset. It achieves the following results on the evaluation set:

  • Loss: 6162013035452755345408.0000
  • Accuracy: 0.6522
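
For reference, the checkpoint can be loaded for inference with the standard Transformers image-classification classes. The snippet below is a minimal sketch, not the original training or evaluation code: the repository id is taken from this card's Hub page, and the image path is a placeholder.

```python
# Minimal inference sketch (illustrative only): load the fine-tuned checkpoint
# from the Hub and classify one local image. "test.png" is a placeholder path.
import torch
from PIL import Image
from transformers import AutoImageProcessor, AutoModelForImageClassification

repo_id = "Augusto777/swiftformer-xs-DMAE-ALT"
processor = AutoImageProcessor.from_pretrained(repo_id)
model = AutoModelForImageClassification.from_pretrained(repo_id)
model.eval()

image = Image.open("test.png").convert("RGB")
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

predicted_id = logits.argmax(-1).item()
print(model.config.id2label[predicted_id])
```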

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed
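
The card only indicates that an `imagefolder` dataset was used; no further details are given. As a hedged illustration, such a dataset is typically loaded with the Hugging Face Datasets library as below. The directory path and the one-sub-folder-per-class layout are assumptions, not the actual data location.

```python
# Illustrative only: typical way to load an "imagefolder" dataset with Datasets.
# The data_dir path is a placeholder; the real dataset is not documented here.
from datasets import load_dataset

dataset = load_dataset("imagefolder", data_dir="path/to/DMAE-images")
print(dataset)                    # DatasetDict with the detected splits
print(dataset["train"].features)  # an "image" column plus a class "label"
```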

Training procedure

Training hyperparameters

The following hyperparameters were used during training (an illustrative TrainingArguments sketch follows the list):

  • learning_rate: 1.5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • gradient_accumulation_steps: 4
  • total_train_batch_size: 64
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • lr_scheduler_warmup_ratio: 0.01
  • num_epochs: 40
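
These settings map directly onto Transformers TrainingArguments. The block below is a sketch for orientation, not the original training script; only the values listed above come from the card, and output_dir is a placeholder.

```python
# Sketch of the reported hyperparameters expressed as TrainingArguments.
# output_dir and any Trainer wiring around this are assumptions.
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swiftformer-xs-DMAE-ALT",  # placeholder output directory
    learning_rate=1.5e-05,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    gradient_accumulation_steps=4,   # 16 x 4 = 64 total train batch size
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-08,
    lr_scheduler_type="linear",
    warmup_ratio=0.01,
    num_train_epochs=40,
)
```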

Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:----------------------------:|:-----:|:----:|:----------------------------:|:--------:|
| No log | 0.86 | 3 | 6162013035452755345408.0000 | 0.4348 |
| No log | 2.0 | 7 | 6162013035452755345408.0000 | 0.5217 |
| 6041083954518472785920.0000 | 2.86 | 10 | 6162013035452755345408.0000 | 0.6304 |
| 6041083954518472785920.0000 | 4.0 | 14 | 6162013035452755345408.0000 | 0.6304 |
| 6041083954518472785920.0000 | 4.86 | 17 | 6162013035452755345408.0000 | 0.6087 |
| 6324536912185469698048.0000 | 6.0 | 21 | 6162013035452755345408.0000 | 0.6087 |
| 6324536912185469698048.0000 | 6.86 | 24 | 6162013035452755345408.0000 | 0.5870 |
| 6324536912185469698048.0000 | 8.0 | 28 | 6162013035452755345408.0000 | 0.5870 |
| 7104031645049785679872.0000 | 8.86 | 31 | 6162013035452755345408.0000 | 0.6304 |
| 7104031645049785679872.0000 | 10.0 | 35 | 6162013035452755345408.0000 | 0.6304 |
| 7104031645049785679872.0000 | 10.86 | 38 | 6162013035452755345408.0000 | 0.6304 |
| 5243873411799968645120.0000 | 12.0 | 42 | 6162013035452755345408.0000 | 0.6087 |
| 5243873411799968645120.0000 | 12.86 | 45 | 6162013035452755345408.0000 | 0.6087 |
| 5243873411799968645120.0000 | 14.0 | 49 | 6162013035452755345408.0000 | 0.6304 |
| 6838294497236975878144.0000 | 14.86 | 52 | 6162013035452755345408.0000 | 0.6304 |
| 6838294497236975878144.0000 | 16.0 | 56 | 6162013035452755345408.0000 | 0.6304 |
| 6838294497236975878144.0000 | 16.86 | 59 | 6162013035452755345408.0000 | 0.6304 |
| 6448545779724929990656.0000 | 18.0 | 63 | 6162013035452755345408.0000 | 0.6304 |
| 6448545779724929990656.0000 | 18.86 | 66 | 6162013035452755345408.0000 | 0.6304 |
| 6058800665092585160704.0000 | 20.0 | 70 | 6162013035452755345408.0000 | 0.6304 |
| 6058800665092585160704.0000 | 20.86 | 73 | 6162013035452755345408.0000 | 0.6304 |
| 6058800665092585160704.0000 | 22.0 | 77 | 6162013035452755345408.0000 | 0.6304 |
| 5846209595762449317888.0000 | 22.86 | 80 | 6162013035452755345408.0000 | 0.6304 |
| 5846209595762449317888.0000 | 24.0 | 84 | 6162013035452755345408.0000 | 0.6304 |
| 5846209595762449317888.0000 | 24.86 | 87 | 6162013035452755345408.0000 | 0.6522 |
| 5846210496482374582272.0000 | 26.0 | 91 | 6162013035452755345408.0000 | 0.6304 |
| 5846210496482374582272.0000 | 26.86 | 94 | 6162013035452755345408.0000 | 0.6304 |
| 5846210496482374582272.0000 | 28.0 | 98 | 6162013035452755345408.0000 | 0.6522 |
| 6625704778986728456192.0000 | 28.86 | 101 | 6162013035452755345408.0000 | 0.6304 |
| 6625704778986728456192.0000 | 30.0 | 105 | 6162013035452755345408.0000 | 0.6304 |
| 6625704778986728456192.0000 | 30.86 | 108 | 6162013035452755345408.0000 | 0.6522 |
| 5952505355607498293248.0000 | 32.0 | 112 | 6162013035452755345408.0000 | 0.6304 |
| 5952505355607498293248.0000 | 32.86 | 115 | 6162013035452755345408.0000 | 0.6304 |
| 5952505355607498293248.0000 | 34.0 | 119 | 6162013035452755345408.0000 | 0.6304 |
| 6041083504158509629440.0000 | 34.29 | 120 | 6162013035452755345408.0000 | 0.6304 |

Framework versions

  • Transformers 4.36.2
  • PyTorch 2.1.2+cu118
  • Datasets 2.16.1
  • Tokenizers 0.15.0