
segformer-b0-scene-parse-150_epoch_100_230609

This model is a fine-tuned version of nvidia/mit-b0 on the scene_parse_150 dataset. It achieves the following results on the evaluation set (an illustrative inference sketch follows the metric list):

  • Loss: 2.7126
  • Mean Iou: 0.1053
  • Mean Accuracy: 0.1994
  • Overall Accuracy: 0.5447
  • Per Category Iou: [0.48741983413436024, 0.34708122936068353, 0.8494644532893246, 0.3618389507826823, 0.016919144195669256, 0.746579767268802, 0.0, 0.4008814740204453, 0.26432782122527576, 0.0, 0.0, 0.2358305940560507, 0.13905866374131537, nan, 0.0, 0.0, 0.5318380393695908, 0.0, 0.0, 0.041298586572438165, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]
  • Per Category Accuracy: [0.7865618692274757, 0.9652097859624402, 0.9908729919072352, 0.5594874236350619, 0.12989690721649486, 0.8943671630094044, nan, 0.8825049920983964, 0.29573472254593786, nan, 0.0, 0.9468519337392428, 0.16706413957574998, nan, 0.0, 0.0, 0.5378679869020947, 0.0, 0.0, 0.21969845310358332, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan]
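
A checkpoint like this can be run with the standard SegFormer classes in transformers. The snippet below is a minimal sketch, not part of the original training code: it assumes the Hub repo id above, that the image processor config was pushed alongside the model (otherwise load the processor from nvidia/mit-b0), and a hypothetical local image `example.jpg`.

```python
from PIL import Image
import torch
from transformers import SegformerImageProcessor, SegformerForSemanticSegmentation

repo_id = "manadopeee/segformer-b0-scene-parse-150_epoch_100_230609"
processor = SegformerImageProcessor.from_pretrained(repo_id)
model = SegformerForSemanticSegmentation.from_pretrained(repo_id)

image = Image.open("example.jpg")  # hypothetical input image
inputs = processor(images=image, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # (1, num_labels, H/4, W/4)

# Upsample the logits to the input resolution and take the per-pixel argmax.
upsampled = torch.nn.functional.interpolate(
    logits, size=image.size[::-1], mode="bilinear", align_corners=False
)
segmentation_map = upsampled.argmax(dim=1)[0]  # (H, W) tensor of label ids
```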

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a matching TrainingArguments sketch follows the list):

  • learning_rate: 6e-05
  • train_batch_size: 8
  • eval_batch_size: 8
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 100
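
For readers reproducing the run, these values map onto a transformers TrainingArguments roughly as follows. This is a sketch under stated assumptions, not the author's actual script; the output_dir name is invented, and the Adam betas/epsilon listed above are the transformers defaults, so they need no explicit arguments.

```python
from transformers import TrainingArguments

training_args = TrainingArguments(
    output_dir="segformer-b0-scene-parse-150",  # hypothetical directory name
    learning_rate=6e-5,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    seed=42,
    lr_scheduler_type="linear",  # linear decay, as listed above
    num_train_epochs=100,
)
```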

Training results

| Training Loss | Epoch | Step | Validation Loss | Mean Iou | Mean Accuracy | Overall Accuracy | Per Category Iou | Per Category Accuracy |
|:---|:---|:---|:---|:---|:---|:---|:---|:---|
| 3.335 | 20.0 | 100 | 3.5913 | 0.0958 | 0.1968 | 0.4914 | [0.4372210968359756, 0.3028306951772656, 0.9033017061947888, 0.3690449269582307, 0.05890453885736904, 0.521817339647163, 0.0, 0.3631349261471501, 0.05912798485639358, nan, 0.0, 0.23295937758137303, 0.12080500701413618, 0.0, 0.0, 0.0, 0.40666846895557357, 0.0, 0.0, 0.15182824063896827, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7828715313882603, 0.9935011297101626, 0.9796020050730765, 0.6524082439607645, 0.5454753722794959, 0.5683581504702194, nan, 0.7686116453711304, 0.05922631608786308, nan, 0.0, 0.9736725738970713, 0.14051713317434417, nan, 0.0, 0.0, 0.41243778756116795, 0.0, 0.0, 0.40571764245153713, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 2.2088 | 40.0 | 200 | 2.9755 | 0.1102 | 0.2011 | 0.5560 | [0.45422192073986317, 0.3436668041953486, 0.8903445028964444, 0.36640300640210627, 0.08482177830003917, 0.696578291411738, 0.0, 0.3924824887368871, 0.1146148769912978, nan, 0.0, 0.2583488263193765, 0.09984717269485481, nan, 0.0, 0.0, 0.6613259967945657, 0.0, 0.0, 0.044113233970191054, 0.0, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7754602727995596, 0.9612375792619227, 0.9850827394612875, 0.5917451364483808, 0.3307369224894998, 0.9034090909090909, nan, 0.8772617574636465, 0.11706677921472532, nan, 0.0, 0.9477023027994149, 0.11070666499309652, nan, 0.0, 0.0, 0.6691055509423911, 0.0, 0.0, 0.17270413158410025, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 1.8764 | 60.0 | 300 | 2.8496 | 0.1046 | 0.1910 | 0.5299 | [0.44823292205691717, 0.3374611910810048, 0.8521673994463442, 0.36771300448430494, 0.011525925925925926, 0.6769752103220841, nan, 0.4127585356400409, 0.19237793012603657, nan, 0.0, 0.23536215301960003, 0.10166928075285682, nan, 0.0, 0.0, 0.502039728794969, 0.0, 0.0, 0.044836210577685595, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.00027059937762143147, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7515036597549289, 0.9295206627632954, 0.9876570237951443, 0.5829066103598283, 0.08911798396334479, 0.9305789576802508, nan, 0.8610542821791275, 0.20239752562253144, nan, 0.0, 0.9655940678254362, 0.1139073678925568, nan, 0.0, 0.0, 0.5067709067740402, 0.0, 0.0, 0.14392010965341687, nan, nan, nan, 0.0, nan, nan, nan, 0.0002748763056624519, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 1.6882 | 80.0 | 400 | 2.6676 | 0.1123 | 0.2036 | 0.5699 | [0.4826916675912571, 0.35289291208668705, 0.8613952449463594, 0.3690071358526864, 0.04114119410882794, 0.7420633159137224, 0.0, 0.39243581224605395, 0.26480929728158487, nan, 0.0, 0.242911210420564, 0.12443874278383579, nan, 0.0, 0.0, 0.6824408307674852, 0.0, 0.0, 0.04806344199088679, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7805655799539217, 0.9613226112096402, 0.9898462978620607, 0.573010513921042, 0.1424971363115693, 0.9007004310344827, nan, 0.8826764997733648, 0.29841193849332587, nan, 0.0, 0.9540290486070955, 0.14610267352830425, nan, 0.0, 0.0, 0.6910372308479692, 0.0, 0.0, 0.21480321127863716, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
| 1.9454 | 100.0 | 500 | 2.7126 | 0.1053 | 0.1994 | 0.5447 | [0.48741983413436024, 0.34708122936068353, 0.8494644532893246, 0.3618389507826823, 0.016919144195669256, 0.746579767268802, 0.0, 0.4008814740204453, 0.26432782122527576, 0.0, 0.0, 0.2358305940560507, 0.13905866374131537, nan, 0.0, 0.0, 0.5318380393695908, 0.0, 0.0, 0.041298586572438165, 0.0, nan, nan, 0.0, nan, 0.0, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] | [0.7865618692274757, 0.9652097859624402, 0.9908729919072352, 0.5594874236350619, 0.12989690721649486, 0.8943671630094044, nan, 0.8825049920983964, 0.29573472254593786, nan, 0.0, 0.9468519337392428, 0.16706413957574998, nan, 0.0, 0.0, 0.5378679869020947, 0.0, 0.0, 0.21969845310358332, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, 0.0, nan, 0.0, nan, nan, nan, 0.0, nan, 0.0, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, 0.0, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, 0.0, nan, nan, nan, nan, nan, 0.0, nan, nan, 0.0, nan, nan, 0.0, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan, nan] |
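
The metric columns above match the output of the mean_iou metric from the evaluate library, which Trainer-based SegFormer fine-tuning typically uses; nan entries correspond to categories whose union of predicted and ground-truth pixels is empty in the evaluation set, so their IoU and accuracy are undefined. Below is a minimal sketch of computing these numbers, assuming 150 labels and the conventional ignore index of 255 (whether reduce_labels applies depends on the preprocessing, which this card does not document).

```python
import numpy as np
import evaluate

metric = evaluate.load("mean_iou")

# Toy 2x2 masks; in practice these are the predicted and ground-truth
# segmentation maps for the whole evaluation set.
predictions = [np.array([[0, 1], [1, 2]])]
references = [np.array([[0, 1], [1, 1]])]

results = metric.compute(
    predictions=predictions,
    references=references,
    num_labels=150,    # scene_parse_150 has 150 semantic classes
    ignore_index=255,  # assumption: the usual ignore index for this dataset
    reduce_labels=False,
)
print(results["mean_iou"], results["mean_accuracy"], results["overall_accuracy"])
# results["per_category_iou"] and results["per_category_accuracy"] are
# length-150 arrays, with nan for labels that never occur, as in the table.
```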

Framework versions

  • Transformers 4.29.2
  • Pytorch 2.0.1
  • Datasets 2.12.0
  • Tokenizers 0.13.3