---
library_name: transformers
license: apache-2.0
base_model: facebook/wav2vec2-large-xlsr-53
tags:
- generated_from_trainer
metrics:
- wer
model-index:
- name: xlsr-aiish-nomiii
  results: []
---

# xlsr-aiish-nomiii

This model is a fine-tuned version of [facebook/wav2vec2-large-xlsr-53](https://huggingface.co/facebook/wav2vec2-large-xlsr-53) on an unspecified dataset.
It achieves the following results on the evaluation set:
- Loss: 0.0000
- Wer: 0.3068

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training (a minimal `TrainingArguments` sketch reproducing them follows this list):
- learning_rate: 0.0004
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- lr_scheduler_warmup_steps: 132
- num_epochs: 100
- mixed_precision_training: Native AMP
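The sketch below reconstructs these hyperparameters as a `transformers.TrainingArguments` object. It is an assumption-laden illustration, not the exact training script: `output_dir` is made up, and the 200-step evaluation/logging interval is inferred from the step spacing in the results table below.

```python
from transformers import TrainingArguments

# Hypothetical reconstruction of the hyperparameters listed above.
training_args = TrainingArguments(
    output_dir="xlsr-aiish-nomiii",   # assumed output directory
    learning_rate=4e-4,
    per_device_train_batch_size=8,
    per_device_eval_batch_size=8,
    gradient_accumulation_steps=2,    # effective train batch size 16
    num_train_epochs=100,
    lr_scheduler_type="linear",
    warmup_steps=132,
    seed=42,
    fp16=True,                        # Native AMP mixed precision
    eval_strategy="steps",
    eval_steps=200,                   # inferred from the 200-step intervals in the results table
    logging_steps=200,
)
```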
### Training results

| Training Loss | Epoch | Step | Validation Loss | Wer |
|:-------------:|:-------:|:-----:|:---------------:|:------:|
| 4.6675 | 1.6327 | 200 | 2.6799 | 1.0 |
| 1.8836 | 3.2653 | 400 | 0.3203 | 0.6161 |
| 0.3406 | 4.8980 | 600 | 0.0509 | 0.4144 |
| 0.1344 | 6.5306 | 800 | 0.0307 | 0.3704 |
| 0.0919 | 8.1633 | 1000 | 0.0077 | 0.3093 |
| 0.0576 | 9.7959 | 1200 | 0.0027 | 0.3105 |
| 0.0488 | 11.4286 | 1400 | 0.0023 | 0.3093 |
| 0.0497 | 13.0612 | 1600 | 0.0157 | 0.3178 |
| 0.039 | 14.6939 | 1800 | 0.0007 | 0.3068 |
| 0.0338 | 16.3265 | 2000 | 0.0086 | 0.3105 |
| 0.0347 | 17.9592 | 2200 | 0.0020 | 0.3081 |
| 0.0259 | 19.5918 | 2400 | 0.0004 | 0.3081 |
| 0.0254 | 21.2245 | 2600 | 0.0268 | 0.3227 |
| 0.0321 | 22.8571 | 2800 | 0.0093 | 0.3142 |
| 0.0255 | 24.4898 | 3000 | 0.0003 | 0.3105 |
| 0.0222 | 26.1224 | 3200 | 0.0004 | 0.3068 |
| 0.0203 | 27.7551 | 3400 | 0.0106 | 0.3142 |
| 0.0207 | 29.3878 | 3600 | 0.0005 | 0.3068 |
| 0.0177 | 31.0204 | 3800 | 0.0012 | 0.3068 |
| 0.0143 | 32.6531 | 4000 | 0.0002 | 0.3068 |
| 0.0181 | 34.2857 | 4200 | 0.0003 | 0.3068 |
| 0.0142 | 35.9184 | 4400 | 0.0002 | 0.3068 |
| 0.0141 | 37.5510 | 4600 | 0.0003 | 0.3068 |
| 0.0117 | 39.1837 | 4800 | 0.0001 | 0.3068 |
| 0.0115 | 40.8163 | 5000 | 0.0001 | 0.3081 |
| 0.0107 | 42.4490 | 5200 | 0.0002 | 0.3068 |
| 0.0093 | 44.0816 | 5400 | 0.0001 | 0.3068 |
| 0.0111 | 45.7143 | 5600 | 0.0004 | 0.3068 |
| 0.013 | 47.3469 | 5800 | 0.0001 | 0.3068 |
| 0.0115 | 48.9796 | 6000 | 0.0004 | 0.3068 |
| 0.0109 | 50.6122 | 6200 | 0.0001 | 0.3068 |
| 0.0073 | 52.2449 | 6400 | 0.0001 | 0.3068 |
| 0.0058 | 53.8776 | 6600 | 0.0001 | 0.3068 |
| 0.0123 | 55.5102 | 6800 | 0.0001 | 0.3068 |
| 0.0119 | 57.1429 | 7000 | 0.0008 | 0.3081 |
| 0.0108 | 58.7755 | 7200 | 0.0001 | 0.3068 |
| 0.0069 | 60.4082 | 7400 | 0.0001 | 0.3068 |
| 0.0054 | 62.0408 | 7600 | 0.0009 | 0.3081 |
| 0.0066 | 63.6735 | 7800 | 0.0001 | 0.3068 |
| 0.0072 | 65.3061 | 8000 | 0.0001 | 0.3068 |
| 0.0049 | 66.9388 | 8200 | 0.0008 | 0.3068 |
| 0.0054 | 68.5714 | 8400 | 0.0001 | 0.3068 |
| 0.0048 | 70.2041 | 8600 | 0.0001 | 0.3068 |
| 0.0046 | 71.8367 | 8800 | 0.0000 | 0.3068 |
| 0.0034 | 73.4694 | 9000 | 0.0021 | 0.3093 |
| 0.0049 | 75.1020 | 9200 | 0.0000 | 0.3068 |
| 0.0016 | 76.7347 | 9400 | 0.0000 | 0.3068 |
| 0.0039 | 78.3673 | 9600 | 0.0000 | 0.3068 |
| 0.0036 | 80.0 | 9800 | 0.0000 | 0.3068 |
| 0.0036 | 81.6327 | 10000 | 0.0000 | 0.3068 |
| 0.0022 | 83.2653 | 10200 | 0.0006 | 0.3068 |
| 0.0033 | 84.8980 | 10400 | 0.0000 | 0.3081 |
| 0.0027 | 86.5306 | 10600 | 0.0000 | 0.3068 |
| 0.0018 | 88.1633 | 10800 | 0.0000 | 0.3068 |
| 0.0028 | 89.7959 | 11000 | 0.0000 | 0.3068 |
| 0.0019 | 91.4286 | 11200 | 0.0000 | 0.3068 |
| 0.0023 | 93.0612 | 11400 | 0.0000 | 0.3068 |
| 0.0012 | 94.6939 | 11600 | 0.0000 | 0.3068 |
| 0.0017 | 96.3265 | 11800 | 0.0000 | 0.3068 |
| 0.0014 | 97.9592 | 12000 | 0.0000 | 0.3068 |
| 0.0011 | 99.5918 | 12200 | 0.0000 | 0.3068 |

### Framework versions

- Transformers 4.46.0.dev0
- Pytorch 2.4.0
- Datasets 3.0.0
- Tokenizers 0.20.0
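## How to use

Since the base model is a wav2vec2 checkpoint fine-tuned with a CTC head, inference can follow the standard `Wav2Vec2ForCTC` pattern. This is a minimal sketch under assumptions: the repo id, the audio file name, and 16 kHz input (the rate wav2vec2-large-xlsr-53 expects) are placeholders, not details taken from this card.

```python
import torch
import librosa
from transformers import Wav2Vec2Processor, Wav2Vec2ForCTC

# Assumed repo id; replace with the actual Hub path, e.g. "<user>/xlsr-aiish-nomiii".
model_id = "xlsr-aiish-nomiii"

processor = Wav2Vec2Processor.from_pretrained(model_id)
model = Wav2Vec2ForCTC.from_pretrained(model_id)
model.eval()

# Load audio at 16 kHz (placeholder file name).
speech, sr = librosa.load("sample.wav", sr=16000)

inputs = processor(speech, sampling_rate=16000, return_tensors="pt", padding=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Greedy CTC decoding of the most likely token at each frame.
pred_ids = torch.argmax(logits, dim=-1)
transcription = processor.batch_decode(pred_ids)[0]
print(transcription)
```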