---
license: apache-2.0
base_model: t5-small
tags:
- generated_from_keras_callback
model-index:
- name: pijarcandra22/t5Indo2Sunda
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# pijarcandra22/t5Indo2Sunda
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small), trained for Indonesian-to-Sundanese translation (as the model name indicates) on an unspecified dataset. A minimal usage sketch is included after the evaluation results below.
It achieves the following results on the evaluation set:
- Train Loss: 2.5778
- Validation Loss: 2.4100
- Epoch: 39
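Since the card does not yet document usage, the following is a minimal inference sketch. It assumes the checkpoint loads with the standard `transformers` TensorFlow classes and that the input needs no T5-style task prefix; the Indonesian example sentence is purely illustrative.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

# Load the fine-tuned checkpoint from the Hub (TensorFlow weights).
tokenizer = AutoTokenizer.from_pretrained("pijarcandra22/t5Indo2Sunda")
model = TFAutoModelForSeq2SeqLM.from_pretrained("pijarcandra22/t5Indo2Sunda")

# Hypothetical Indonesian input; the expected input format
# (e.g. whether a task prefix is required) is not documented.
text = "Saya sedang belajar memasak."
inputs = tokenizer(text, return_tensors="tf")

# Generate the Sundanese translation and decode it.
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```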
## Model description
More information needed
## Intended uses & limitations
More information needed
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training (a rough reconstruction of the optimizer is sketched after the list):
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
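For reference, the optimizer configuration above can be reconstructed roughly with the `AdamWeightDecay` class shipped with `transformers`. This is a sketch under assumptions, not the original training script; the compile/fit lines in particular are guesses at a typical Keras setup.

```python
from transformers import AdamWeightDecay, TFAutoModelForSeq2SeqLM

# Rebuild the optimizer described in the config dict above.
optimizer = AdamWeightDecay(
    learning_rate=2e-05,   # constant rate, since 'decay' is 0.0
    weight_decay_rate=0.01,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-07,
)

# Assumed training setup: compile the TF model with this optimizer;
# Transformers TF models compute their loss internally when none is passed.
model = TFAutoModelForSeq2SeqLM.from_pretrained("t5-small")
model.compile(optimizer=optimizer)
# model.fit(train_dataset, validation_data=val_dataset, epochs=40)
```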
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 4.3724 | 3.9124 | 0 |
| 3.9887 | 3.6348 | 1 |
| 3.7534 | 3.4215 | 2 |
| 3.5819 | 3.2847 | 3 |
| 3.4632 | 3.1902 | 4 |
| 3.3751 | 3.1139 | 5 |
| 3.3039 | 3.0493 | 6 |
| 3.2447 | 2.9955 | 7 |
| 3.1911 | 2.9481 | 8 |
| 3.1455 | 2.9082 | 9 |
| 3.1068 | 2.8718 | 10 |
| 3.0697 | 2.8387 | 11 |
| 3.0381 | 2.8105 | 12 |
| 3.0050 | 2.7825 | 13 |
| 2.9796 | 2.7568 | 14 |
| 2.9510 | 2.7350 | 15 |
| 2.9259 | 2.7096 | 16 |
| 2.9053 | 2.6881 | 17 |
| 2.8833 | 2.6696 | 18 |
| 2.8599 | 2.6510 | 19 |
| 2.8403 | 2.6328 | 20 |
| 2.8207 | 2.6171 | 21 |
| 2.8046 | 2.5999 | 22 |
| 2.7861 | 2.5857 | 23 |
| 2.7715 | 2.5699 | 24 |
| 2.7557 | 2.5542 | 25 |
| 2.7387 | 2.5420 | 26 |
| 2.7225 | 2.5299 | 27 |
| 2.7085 | 2.5182 | 28 |
| 2.6950 | 2.5081 | 29 |
| 2.6818 | 2.4951 | 30 |
| 2.6687 | 2.4864 | 31 |
| 2.6578 | 2.4760 | 32 |
| 2.6461 | 2.4651 | 33 |
| 2.6334 | 2.4559 | 34 |
| 2.6213 | 2.4477 | 35 |
| 2.6096 | 2.4373 | 36 |
| 2.5993 | 2.4297 | 37 |
| 2.5906 | 2.4208 | 38 |
| 2.5778 | 2.4100 | 39 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.14.0
- Datasets 2.15.0
- Tokenizers 0.15.0