# murat_chem_translation_model
This model is a fine-tuned version of [Helsinki-NLP/opus-mt-de-en](https://huggingface.co/Helsinki-NLP/opus-mt-de-en) on an unspecified dataset. It achieves the following results on the evaluation set:
- Loss: 1.2397
- Bleu: 46.8627
- Rouge1: 0.7695, Rouge2: 0.5864, RougeL: 0.7478, RougeLsum: 0.7479
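A minimal usage sketch, assuming the `transformers` library is installed and the checkpoint is reachable on the Hugging Face Hub; the German input sentence is illustrative only:

```python
from transformers import pipeline

# Load the fine-tuned German-to-English checkpoint from the Hub.
translator = pipeline(
    "translation", model="muratti18462/murat_chem_translation_model"
)

# Illustrative German chemistry sentence (not from the training data).
result = translator("Die Reaktion verläuft bei Raumtemperatur exotherm.")
print(result[0]["translation_text"])
```

The `pipeline` wrapper handles tokenization and decoding; for batched or GPU inference, the underlying model and tokenizer can also be loaded directly with `AutoModelForSeq2SeqLM` and `AutoTokenizer`.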
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure

### Training hyperparameters
The following hyperparameters were used during training:
- learning_rate: 5e-05
- train_batch_size: 8
- eval_batch_size: 8
- seed: 42
- gradient_accumulation_steps: 2
- total_train_batch_size: 16
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 6
- mixed_precision_training: Native AMP
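The reported total train batch size follows from the per-device batch size and gradient accumulation: gradients are accumulated over 2 forward passes of 8 examples each before every optimizer step. A one-line check of that arithmetic:

```python
# Effective (total) train batch size =
#   per-device batch size x gradient accumulation steps.
train_batch_size = 8
gradient_accumulation_steps = 2
total_train_batch_size = train_batch_size * gradient_accumulation_steps
print(total_train_batch_size)  # 16, matching the value reported above
```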
### Training results
| Training Loss | Epoch | Step | Validation Loss | Bleu | Rouge1 | Rouge2 | RougeL | RougeLsum |
|---|---|---|---|---|---|---|---|---|
| 1.6658 | 0.9882 | 42 | 1.1978 | 45.7366 | 0.7467 | 0.5572 | 0.7253 | 0.7252 |
| 0.8913 | 2.0 | 85 | 1.1930 | 47.3050 | 0.7661 | 0.5837 | 0.7453 | 0.7453 |
| 0.8207 | 2.9882 | 127 | 1.2109 | 46.9471 | 0.7699 | 0.5886 | 0.7487 | 0.7487 |
| 0.6947 | 4.0 | 170 | 1.2213 | 46.8747 | 0.7682 | 0.5850 | 0.7464 | 0.7465 |
| 0.5649 | 4.9882 | 212 | 1.2328 | 46.9369 | 0.7675 | 0.5865 | 0.7465 | 0.7464 |
| 0.4656 | 5.9294 | 252 | 1.2397 | 46.8627 | 0.7695 | 0.5864 | 0.7478 | 0.7479 |
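Note that validation loss reaches its minimum at epoch 2 and rises thereafter while training loss keeps falling, a typical sign of overfitting; BLEU also peaks at epoch 2. A short sketch selecting the best checkpoint by BLEU, with the triples copied from the table above:

```python
# (epoch, validation loss, BLEU) triples copied from the results table.
results = [
    (0.9882, 1.1978, 45.7366),
    (2.0,    1.1930, 47.3050),
    (2.9882, 1.2109, 46.9471),
    (4.0,    1.2213, 46.8747),
    (4.9882, 1.2328, 46.9369),
    (5.9294, 1.2397, 46.8627),
]

# Pick the checkpoint with the highest BLEU score.
best_epoch, best_loss, best_bleu = max(results, key=lambda r: r[2])
print(best_epoch, best_loss, best_bleu)
# Epoch 2.0 has both the lowest validation loss and the best BLEU.
```

With `load_best_model_at_end=True` and `metric_for_best_model` set in the training arguments, the Trainer can perform this selection automatically.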
### Framework versions
- Transformers 4.43.3
- Pytorch 2.3.0+cu121
- Datasets 2.20.0
- Tokenizers 0.19.1