---
license: apache-2.0
base_model: facebook/bart-base
tags:
- generated_from_keras_callback
model-index:
- name: pijarcandra22/NMTIndoBaliBART
  results: []
---

# pijarcandra22/NMTIndoBaliBART

This model is a fine-tuned version of [facebook/bart-base](https://huggingface.co/facebook/bart-base) on an unknown dataset.
It achieves the following results on the evaluation set:
- Train Loss: 5.5373
- Validation Loss: 5.5667
- Epoch: 38

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 0.02, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 9.7885     | 5.6003          | 0     |
| 5.5737     | 5.5523          | 1     |
| 5.5346     | 5.5361          | 2     |
| 5.5189     | 5.5283          | 3     |
| 5.5149     | 5.5252          | 4     |
| 5.5123     | 5.5233          | 5     |
| 5.5116     | 5.5485          | 6     |
| 5.5095     | 5.5314          | 7     |
| 5.5120     | 5.5569          | 8     |
| 5.5137     | 5.5239          | 9     |
| 5.5170     | 5.5289          | 10    |
| 5.5180     | 5.5298          | 11    |
| 5.5217     | 5.5513          | 12    |
| 5.5219     | 5.5344          | 13    |
| 5.5248     | 5.5366          | 14    |
| 5.5268     | 5.5493          | 15    |
| 5.5260     | 5.5313          | 16    |
| 5.5290     | 5.5462          | 17    |
| 5.5299     | 5.5570          | 18    |
| 5.5293     | 5.5480          | 19    |
| 5.5378     | 5.5524          | 20    |
| 5.5317     | 5.5740          | 21    |
| 5.5328     | 5.5543          | 22    |
| 5.5327     | 5.5537          | 23    |
| 5.5330     | 5.5356          | 24    |
| 5.5304     | 5.5492          | 25    |
| 5.5355     | 5.5388          | 26    |
| 5.5337     | 5.5812          | 27    |
| 5.5355     | 5.5598          | 28    |
| 5.5348     | 5.5489          | 29    |
| 5.5373     | 5.5526          | 30    |
| 5.5357     | 5.5575          | 31    |
| 5.5377     | 5.5439          | 32    |
| 5.5404     | 5.5367          | 33    |
| 5.5383     | 5.5819          | 34    |
| 5.5359     | 5.5815          | 35    |
| 5.5370     | 5.5499          | 36    |
| 5.5340     | 5.5622          | 37    |
| 5.5373     | 5.5667          | 38    |

### Framework versions

- Transformers 4.40.2
- TensorFlow 2.15.0
- Datasets 2.19.1
- Tokenizers 0.19.1
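
The training script itself is not included in this card. As a rough sketch only, the optimizer settings listed under "Training hyperparameters" could be reproduced with the `AdamWeightDecay` class shipped with Transformers (assuming TF/Keras training, consistent with the `generated_from_keras_callback` tag):

```python
# Sketch only: rebuilds the optimizer configuration listed above using the
# TF optimizer utility bundled with Transformers. The actual training script
# is not part of this card.
from transformers import AdamWeightDecay

optimizer = AdamWeightDecay(
    learning_rate=0.02,      # 'learning_rate' from the config above
    weight_decay_rate=0.01,  # 'weight_decay_rate'
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
    amsgrad=False,
)
```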
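
For inference, a minimal loading sketch follows. The translation direction (Indonesian to Balinese) is only inferred from the model name and is not documented in this card; the example sentence and generation settings are placeholders.

```python
# Minimal, assumed usage sketch: loads this checkpoint with the TF auto classes
# and generates an output sequence. The language direction is inferred from the
# model name, not documented here.
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

model_id = "pijarcandra22/NMTIndoBaliBART"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(model_id)

inputs = tokenizer("Selamat pagi.", return_tensors="tf")  # placeholder Indonesian input
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```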