# whisper-tiny-khmer-aug-v6
This model is a fine-tuned version of openai/whisper-tiny on an unknown dataset. It achieves the following results on the evaluation set:
- Loss: 0.2583
- Wer: 65.9478
## Model description
More information needed
## Intended uses & limitations
More information needed
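Usage details are not documented in this card; the snippet below is only a minimal inference sketch that loads this checkpoint with the 🤗 Transformers `pipeline`. The audio file path is a placeholder, not a file shipped with the repository.

```python
import torch
from transformers import pipeline

# Load this repository's checkpoint for Khmer automatic speech recognition.
asr = pipeline(
    "automatic-speech-recognition",
    model="rinabuoy/whisper-tiny-khmer-aug-v6",
    device=0 if torch.cuda.is_available() else -1,
)

# Transcribe a local audio file (placeholder path).
result = asr("sample_khmer.wav")
print(result["text"])
```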
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 0.0001
- train_batch_size: 16
- eval_batch_size: 16
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 32
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: constant
- lr_scheduler_warmup_steps: 1000
- num_epochs: 10
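For orientation, the hyperparameters above correspond roughly to the `Seq2SeqTrainingArguments` sketched below. This is a hypothetical reconstruction, not the author's training script; the dataset, model loading, and data collator setup are not part of this card.

```python
from transformers import Seq2SeqTrainingArguments

# Hypothetical reconstruction of the listed hyperparameters; not the original script.
training_args = Seq2SeqTrainingArguments(
    output_dir="whisper-tiny-khmer-aug-v6",
    learning_rate=1e-4,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=2,  # effective train batch size of 32
    lr_scheduler_type="constant",
    warmup_steps=1000,
    num_train_epochs=10,
    seed=42,
    eval_strategy="epoch",          # assumption: the results table reports per-epoch evaluation
    predict_with_generate=True,     # assumption: needed to compute WER during evaluation
)
```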
### Training results
| Training Loss | Epoch  | Step | Validation Loss | Wer     |
|:-------------:|:------:|:----:|:---------------:|:-------:|
| 0.7761        | 0.9994 | 837  | 0.3819          | 88.5358 |
| 0.3222        | 2.0    | 1675 | 0.2888          | 79.8768 |
| 0.2456        | 2.9994 | 2512 | 0.2535          | 73.9419 |
| 0.2058        | 4.0    | 3350 | 0.2438          | 73.8447 |
| 0.1777        | 4.9994 | 4187 | 0.2391          | 69.8070 |
| 0.1577        | 6.0    | 5025 | 0.2397          | 68.3152 |
| 0.1421        | 6.9994 | 5862 | 0.2383          | 68.9801 |
| 0.1283        | 8.0    | 6700 | 0.2418          | 67.2612 |
| 0.1163        | 8.9994 | 7537 | 0.2475          | 69.7097 |
| 0.1052        | 9.9940 | 8370 | 0.2583          | 65.9478 |
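The Wer column is the word error rate expressed as a percentage. Below is a minimal sketch of how such a score is typically computed with the `evaluate` library; the transcripts are illustrative and not drawn from the evaluation set.

```python
import evaluate

# Load the standard WER metric.
wer_metric = evaluate.load("wer")

# Illustrative transcripts; the actual evaluation data is not documented in this card.
references = ["this is a reference transcript"]
predictions = ["this is a predicted transcript"]

# compute() returns a fraction; multiply by 100 to match the table's scale.
wer = 100 * wer_metric.compute(references=references, predictions=predictions)
print(f"WER: {wer:.4f}")
```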
### Framework versions
- Transformers 4.44.0
- Pytorch 2.3.1
- Datasets 2.21.0
- Tokenizers 0.19.1