---
license: apache-2.0
base_model: microsoft/swin-tiny-patch4-window7-224
tags:
  - generated_from_trainer
datasets:
  - imagefolder
metrics:
  - accuracy
model-index:
  - name: swin-tiny-patch4-window7-224-finetuned-phones
    results:
      - task:
          name: Image Classification
          type: image-classification
        dataset:
          name: imagefolder
          type: imagefolder
          config: default
          split: train
          args: default
        metrics:
          - name: Accuracy
            type: accuracy
            value: 0.8653846153846154
---

# swin-tiny-patch4-window7-224-finetuned-phones

This model is a fine-tuned version of [microsoft/swin-tiny-patch4-window7-224](https://huggingface.co/microsoft/swin-tiny-patch4-window7-224) on the imagefolder dataset. It achieves the following results on the evaluation set:

- Loss: 0.3938
- Accuracy: 0.8654
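For a quick smoke test of the checkpoint, an image-classification pipeline can be used. This is a minimal sketch: the hub id `kisa-misa/swin-tiny-patch4-window7-224-finetuned-phones` is inferred from this card's name, and the example image path is hypothetical.

```python
from transformers import pipeline

# Hub id inferred from the card name; adjust if the checkpoint
# is hosted under a different namespace.
classifier = pipeline(
    "image-classification",
    model="kisa-misa/swin-tiny-patch4-window7-224-finetuned-phones",
)

# "phone.jpg" is a placeholder path; any local image file or URL works.
# Returns a list of {"label": ..., "score": ...} dicts, best first.
print(classifier("phone.jpg"))
```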

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed
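The data itself is undocumented, but the metadata above declares an `imagefolder` dataset, i.e. images arranged one class per subdirectory. A loading sketch with a hypothetical `data_dir`:

```python
from datasets import load_dataset

# Hypothetical layout: data/phones/<class_name>/<image>.jpg
dataset = load_dataset("imagefolder", data_dir="data/phones")

# Class labels are derived from the subdirectory names.
print(dataset["train"].features["label"].names)
```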

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 5e-05
- train_batch_size: 32
- eval_batch_size: 32
- seed: 42
- gradient_accumulation_steps: 4
- total_train_batch_size: 128
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_ratio: 0.1
- num_epochs: 30
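Mapped onto `transformers.TrainingArguments`, these settings look roughly as follows. This is a reconstruction from the list above, not the exact training script: `output_dir` is an assumption, and `total_train_batch_size` is the derived product 32 × 4 rather than a separate argument.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="swin-tiny-patch4-window7-224-finetuned-phones",  # assumed
    learning_rate=5e-05,
    per_device_train_batch_size=32,
    per_device_eval_batch_size=32,
    seed=42,
    gradient_accumulation_steps=4,  # effective train batch size: 32 * 4 = 128
    lr_scheduler_type="linear",
    warmup_ratio=0.1,               # "lr_scheduler_warmup_ratio" above
    num_train_epochs=30,
)
```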

### Training results

| Training Loss | Epoch   | Step | Validation Loss | Accuracy |
|:-------------:|:-------:|:----:|:---------------:|:--------:|
| No log        | 0.9333  | 7    | 0.6743          | 0.5673   |
| 0.6763        | 2.0     | 15   | 0.6166          | 0.6923   |
| 0.635         | 2.9333  | 22   | 0.5646          | 0.7404   |
| 0.5724        | 4.0     | 30   | 0.5074          | 0.7308   |
| 0.5724        | 4.9333  | 37   | 0.4809          | 0.7692   |
| 0.527         | 6.0     | 45   | 0.4597          | 0.7692   |
| 0.5304        | 6.9333  | 52   | 0.4758          | 0.7596   |
| 0.4597        | 8.0     | 60   | 0.4343          | 0.7885   |
| 0.4597        | 8.9333  | 67   | 0.4249          | 0.7981   |
| 0.4606        | 10.0    | 75   | 0.4236          | 0.7981   |
| 0.4286        | 10.9333 | 82   | 0.4055          | 0.8462   |
| 0.3857        | 12.0    | 90   | 0.4144          | 0.8269   |
| 0.3857        | 12.9333 | 97   | 0.4294          | 0.7981   |
| 0.3801        | 14.0    | 105  | 0.4081          | 0.8462   |
| 0.3538        | 14.9333 | 112  | 0.4195          | 0.8462   |
| 0.3585        | 16.0    | 120  | 0.4069          | 0.8558   |
| 0.3585        | 16.9333 | 127  | 0.3971          | 0.8558   |
| 0.3258        | 18.0    | 135  | 0.3938          | 0.8654   |
| 0.3288        | 18.9333 | 142  | 0.3964          | 0.8462   |
| 0.3276        | 20.0    | 150  | 0.4423          | 0.8558   |
| 0.3276        | 20.9333 | 157  | 0.4067          | 0.8365   |
| 0.317         | 22.0    | 165  | 0.4179          | 0.8654   |
| 0.288         | 22.9333 | 172  | 0.3882          | 0.8558   |
| 0.2735        | 24.0    | 180  | 0.4215          | 0.8558   |
| 0.2735        | 24.9333 | 187  | 0.3972          | 0.8462   |
| 0.2805        | 26.0    | 195  | 0.3943          | 0.8558   |
| 0.2961        | 26.9333 | 202  | 0.3999          | 0.8558   |
| 0.2832        | 28.0    | 210  | 0.4043          | 0.8558   |

### Framework versions

- Transformers 4.41.2
- Pytorch 2.3.0+cu121
- Datasets 2.19.2
- Tokenizers 0.19.1