---
tags:
  - generated_from_trainer
model-index:
  - name: DNADebertaK6_Fruitfly
    results: []
---

# DNADebertaK6_Fruitfly

This model is a fine-tuned version of an unspecified base model on an unspecified dataset. It achieves the following results on the evaluation set:

- Loss: 1.7137
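
Assuming the reported figure is the mean cross-entropy loss (in nats) produced by the `transformers` Trainer for the language-modelling objective, it can be converted to a perplexity with a one-line calculation. This is a sketch added for context, not part of the original card:

```python
import math

# Assumption: the evaluation loss above is the mean cross-entropy (in nats)
# reported by the transformers Trainer for the language-modelling objective.
eval_loss = 1.7137

# Perplexity is the exponential of the mean cross-entropy.
perplexity = math.exp(eval_loss)
print(f"eval perplexity ~ {perplexity:.2f}")  # ~ 5.55
```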

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a rough `TrainingArguments` equivalent is sketched after the list):

- learning_rate: 5e-05
- train_batch_size: 64
- eval_batch_size: 64
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- training_steps: 600001
- mixed_precision_training: Native AMP
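
A minimal sketch of how these values map onto a `transformers` `TrainingArguments` object. The mapping of `train_batch_size`/`eval_batch_size` to the per-device arguments, the `output_dir`, and the 20000-step evaluation interval (inferred from the results table below) are assumptions, not the original training script:

```python
from transformers import TrainingArguments

# Sketch only: argument names are standard transformers ones; values follow
# the hyperparameter list above where available.
training_args = TrainingArguments(
    output_dir="DNADebertaK6_Fruitfly",   # assumed output directory
    learning_rate=5e-5,
    per_device_train_batch_size=64,       # reported train_batch_size: 64
    per_device_eval_batch_size=64,        # reported eval_batch_size: 64
    seed=42,
    lr_scheduler_type="linear",
    max_steps=600001,                     # reported training_steps: 600001
    fp16=True,                            # mixed_precision_training: Native AMP
    adam_beta1=0.9,
    adam_beta2=0.999,
    adam_epsilon=1e-8,
    evaluation_strategy="steps",
    eval_steps=20_000,                    # assumption, inferred from the results table
    logging_steps=20_000,
)
```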

### Training results

| Training Loss | Epoch  | Step   | Validation Loss |
|:-------------:|:------:|:------:|:---------------:|
| 4.5584        | 5.36   | 20000  | 1.9795          |
| 1.9682        | 10.73  | 40000  | 1.8618          |
| 1.8692        | 16.09  | 60000  | 1.8273          |
| 1.8339        | 21.45  | 80000  | 1.8076          |
| 1.8208        | 26.82  | 100000 | 1.8073          |
| 1.8105        | 32.18  | 120000 | 1.7925          |
| 1.8022        | 37.54  | 140000 | 1.7909          |
| 1.7955        | 42.91  | 160000 | 1.7836          |
| 1.7907        | 48.27  | 180000 | 1.7769          |
| 1.7849        | 53.63  | 200000 | 1.7755          |
| 1.7805        | 59.0   | 220000 | 1.7677          |
| 1.7769        | 64.36  | 240000 | 1.7690          |
| 1.7723        | 69.72  | 260000 | 1.7614          |
| 1.7689        | 75.09  | 280000 | 1.7586          |
| 1.7646        | 80.45  | 300000 | 1.7523          |
| 1.7607        | 85.81  | 320000 | 1.7484          |
| 1.7572        | 91.18  | 340000 | 1.7458          |
| 1.754         | 96.54  | 360000 | 1.7460          |
| 1.7498        | 101.9  | 380000 | 1.7326          |
| 1.7463        | 107.27 | 400000 | 1.7377          |
| 1.7438        | 112.63 | 420000 | 1.7318          |
| 1.7406        | 117.99 | 440000 | 1.7342          |
| 1.7383        | 123.36 | 460000 | 1.7339          |
| 1.7348        | 128.72 | 480000 | 1.7244          |
| 1.7324        | 134.08 | 500000 | 1.7236          |
| 1.7289        | 139.45 | 520000 | 1.7155          |
| 1.7268        | 144.81 | 540000 | 1.7254          |
| 1.725         | 150.17 | 560000 | 1.7191          |
| 1.7221        | 155.54 | 580000 | 1.7147          |
| 1.7209        | 160.9  | 600000 | 1.7137          |

### Framework versions

- Transformers 4.19.2
- Pytorch 1.11.0
- Datasets 2.2.2
- Tokenizers 0.12.1
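
With the package versions above installed, the checkpoint can be loaded as a standard masked-language model. The Hub id below is an assumption pieced together from the author and model name on this card; adjust it if the repository lives under a different namespace:

```python
from transformers import AutoTokenizer, AutoModelForMaskedLM

# Assumed Hub id (author/model name); not stated explicitly in the card.
model_id = "simecek/DNADebertaK6_Fruitfly"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# The model name suggests inputs are DNA sequences tokenised as 6-mers;
# inspect tokenizer.get_vocab() to confirm the expected input format.
```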