---
base_model: MBZUAI/swiftformer-xs
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: swiftformer-xs-DMAE
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: validation
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.45652173913043476
---

# swiftformer-xs-DMAE

This model is a fine-tuned version of [MBZUAI/swiftformer-xs](https://huggingface.co/MBZUAI/swiftformer-xs) on the imagefolder dataset. It achieves the following results on the evaluation set (a usage sketch follows these numbers):

- Loss: 67319515540793508675715072.0000
- Accuracy: 0.4565
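
The card does not include usage code; the snippet below is a minimal inference sketch. It assumes the checkpoint is published on the Hub as `Augusto777/swiftformer-xs-DMAE` (the repo this card belongs to) and that the label names were saved alongside the model during training.

```python
# Minimal inference sketch (assumption: the fine-tuned checkpoint lives at
# "Augusto777/swiftformer-xs-DMAE"; swap in your local output_dir otherwise).
from PIL import Image
from transformers import pipeline

classifier = pipeline("image-classification", model="Augusto777/swiftformer-xs-DMAE")

image = Image.open("example.jpg")  # replace with a path to your own image
for prediction in classifier(image):
    print(f"{prediction['label']}: {prediction['score']:.4f}")
```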

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a sketch mapping them onto the `Trainer` API follows the list):

- learning_rate: 0.015
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 64
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 40
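
The sketch below shows how these values map onto the Hugging Face `Trainer` API. It is not the original training script: the DMAE `imagefolder` splits, image transforms, and label count are assumptions and left as placeholders.

```python
# Sketch only: reproduces the listed hyperparameters with transformers' Trainer.
# Dataset loading/preprocessing and the number of target classes are assumed.
from transformers import (
    AutoImageProcessor,
    AutoModelForImageClassification,
    Trainer,
    TrainingArguments,
)

checkpoint = "MBZUAI/swiftformer-xs"
processor = AutoImageProcessor.from_pretrained(checkpoint)
model = AutoModelForImageClassification.from_pretrained(
    checkpoint,
    # num_labels=<number of DMAE classes>, ignore_mismatched_sizes=True,
)

training_args = TrainingArguments(
    output_dir="swiftformer-xs-DMAE",
    learning_rate=0.015,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,   # 16 * 4 = total train batch size of 64
    num_train_epochs=40,
    lr_scheduler_type="linear",
    warmup_ratio=0.1,
    seed=42,
    evaluation_strategy="epoch",     # assumption: matches the per-epoch rows below
    # Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer default optimizer.
)

# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=train_split,  # an `imagefolder` dataset split (not shown)
#     eval_dataset=val_split,
#     tokenizer=processor,
# )
# trainer.train()
```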

### Training results

| Training Loss | Epoch | Step | Validation Loss | Accuracy |
|:-------------:|:-----:|:----:|:---------------:|:--------:|
| No log | 0.86 | 3 | 67319515540793508675715072.0000 | 0.6739 |
| No log | 2.0 | 7 | 67319515540793508675715072.0000 | 0.1087 |
| 65998362497246039927422976.0000 | 2.86 | 10 | 67319515540793508675715072.0000 | 0.3261 |
| 65998362497246039927422976.0000 | 4.0 | 14 | 67319515540793508675715072.0000 | 0.4565 |
| 65998362497246039927422976.0000 | 4.86 | 17 | 67319515540793508675715072.0000 | 0.3261 |
| 69095061697085437542137856.0000 | 6.0 | 21 | 67319515540793508675715072.0000 | 0.4783 |
| 69095061697085437542137856.0000 | 6.86 | 24 | 67319515540793508675715072.0000 | 0.4565 |
| 69095061697085437542137856.0000 | 8.0 | 28 | 67319515540793508675715072.0000 | 0.4565 |
| 77610986341318196513996800.0000 | 8.86 | 31 | 67319515540793508675715072.0000 | 0.5 |
| 77610986341318196513996800.0000 | 10.0 | 35 | 67319515540793508675715072.0000 | 0.3478 |
| 77610986341318196513996800.0000 | 10.86 | 38 | 67319515540793508675715072.0000 | 0.3478 |
| 57288905682238366283726848.0000 | 12.0 | 42 | 67319515540793508675715072.0000 | 0.3478 |
| 57288905682238366283726848.0000 | 12.86 | 45 | 67319515540793508675715072.0000 | 0.4348 |
| 57288905682238366283726848.0000 | 14.0 | 49 | 67319515540793508675715072.0000 | 0.3696 |
| 74707823001602531749003264.0000 | 14.86 | 52 | 67319515540793508675715072.0000 | 0.3261 |
| 74707823001602531749003264.0000 | 16.0 | 56 | 67319515540793508675715072.0000 | 0.2826 |
| 74707823001602531749003264.0000 | 16.86 | 59 | 67319515540793508675715072.0000 | 0.4565 |
| 70449886504927853738459136.0000 | 18.0 | 63 | 67319515540793508675715072.0000 | 0.4348 |
| 70449886504927853738459136.0000 | 18.86 | 66 | 67319515540793508675715072.0000 | 0.4130 |
| 66191905736067400542978048.0000 | 20.0 | 70 | 67319515540793508675715072.0000 | 0.3478 |
| 66191905736067400542978048.0000 | 20.86 | 73 | 67319515540793508675715072.0000 | 0.4565 |
| 66191905736067400542978048.0000 | 22.0 | 77 | 67319515540793508675715072.0000 | 0.3478 |
| 63869401627606337277919232.0000 | 22.86 | 80 | 67319515540793508675715072.0000 | 0.4130 |
| 63869401627606337277919232.0000 | 24.0 | 84 | 67319515540793508675715072.0000 | 0.3261 |
| 63869401627606337277919232.0000 | 24.86 | 87 | 67319515540793508675715072.0000 | 0.5 |
| 63869386870211073156317184.0000 | 26.0 | 91 | 67319515540793508675715072.0000 | 0.4783 |
| 63869386870211073156317184.0000 | 26.86 | 94 | 67319515540793508675715072.0000 | 0.4565 |
| 63869386870211073156317184.0000 | 28.0 | 98 | 67319515540793508675715072.0000 | 0.4565 |
| 72385304135746204362342400.0000 | 28.86 | 101 | 67319515540793508675715072.0000 | 0.4565 |
| 72385304135746204362342400.0000 | 30.0 | 105 | 67319515540793508675715072.0000 | 0.5 |
| 72385304135746204362342400.0000 | 30.86 | 108 | 67319515540793508675715072.0000 | 0.5 |
| 65030638924441609083813888.0000 | 32.0 | 112 | 67319515540793508675715072.0000 | 0.4565 |
| 65030638924441609083813888.0000 | 32.86 | 115 | 67319515540793508675715072.0000 | 0.4565 |
| 65030638924441609083813888.0000 | 34.0 | 119 | 67319515540793508675715072.0000 | 0.4565 |
| 65998362497246039927422976.0000 | 34.29 | 120 | 67319515540793508675715072.0000 | 0.4565 |

### Framework versions

- Transformers 4.36.2
- Pytorch 2.1.2+cu118
- Datasets 2.16.1
- Tokenizers 0.15.0