---
license: apache-2.0
base_model: t5-small
tags:
- generated_from_keras_callback
model-index:
- name: pijarcandra22/t5Indo2Sunda
results: []
---
<!-- This model card has been generated automatically according to the information Keras had access to. You should
probably proofread and complete it, then remove this comment. -->
# pijarcandra22/t5Indo2Sunda
This model is a fine-tuned version of [t5-small](https://huggingface.co/t5-small) on an unknown dataset.
It achieves the following results after the final training epoch (epoch 99):
- Train Loss: 2.1941
- Validation Loss: 2.1230
## Model description
More information needed. The model name suggests an Indonesian-to-Sundanese (Indo2Sunda) translation model, but this is not documented by the author.
## Intended uses & limitations
More information needed
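No usage example is documented. As a minimal sketch, the checkpoint can presumably be loaded like any other T5 seq2seq checkpoint with the `transformers` library; the Indonesian input sentence below is illustrative, and whether the model expects a task prefix is not stated in this card:

```python
# Hedged sketch: loading the checkpoint for inference with TensorFlow.
# Assumptions: the model is a standard T5 seq2seq checkpoint and takes
# raw Indonesian text with no task prefix (undocumented).

MODEL_ID = "pijarcandra22/t5Indo2Sunda"

def translate(text: str, max_length: int = 128) -> str:
    """Translate an Indonesian sentence to Sundanese (assumed usage)."""
    # Imports live inside the function so the sketch stays importable
    # even where transformers/TensorFlow are not installed.
    from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = TFAutoModelForSeq2SeqLM.from_pretrained(MODEL_ID)

    inputs = tokenizer(text, return_tensors="tf")
    output_ids = model.generate(**inputs, max_length=max_length)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    print(translate("Saya sedang belajar bahasa Sunda."))
```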
## Training and evaluation data
More information needed
## Training procedure
### Training hyperparameters
The following hyperparameters were used during training:
- optimizer: AdamWeightDecay
  - learning_rate: 2e-05
  - decay: 0.0
  - beta_1: 0.9
  - beta_2: 0.999
  - epsilon: 1e-07
  - amsgrad: False
  - weight_decay_rate: 0.01
- training_precision: float32
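The optimizer configuration above matches `transformers`' Keras-compatible `AdamWeightDecay` class. A hedged reconstruction of how it was probably instantiated (the exact training script is not provided):

```python
# Hedged reconstruction of the optimizer listed in the hyperparameters above.
HPARAMS = {
    "learning_rate": 2e-05,
    "decay": 0.0,
    "beta_1": 0.9,
    "beta_2": 0.999,
    "epsilon": 1e-07,
    "amsgrad": False,
    "weight_decay_rate": 0.01,
}

def build_optimizer():
    # Import inside the function so the sketch is importable without TensorFlow.
    from transformers import AdamWeightDecay  # Keras-compatible AdamW variant

    return AdamWeightDecay(
        learning_rate=HPARAMS["learning_rate"],
        beta_1=HPARAMS["beta_1"],
        beta_2=HPARAMS["beta_2"],
        epsilon=HPARAMS["epsilon"],
        amsgrad=HPARAMS["amsgrad"],
        weight_decay_rate=HPARAMS["weight_decay_rate"],
    )
```

The resulting optimizer would then be passed to `model.compile(optimizer=...)` before `model.fit(...)`, as in any Keras training loop.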
### Training results
| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 4.3724 | 3.9124 | 0 |
| 3.9887 | 3.6348 | 1 |
| 3.7534 | 3.4215 | 2 |
| 3.5819 | 3.2847 | 3 |
| 3.4632 | 3.1902 | 4 |
| 3.3751 | 3.1139 | 5 |
| 3.3039 | 3.0493 | 6 |
| 3.2447 | 2.9955 | 7 |
| 3.1911 | 2.9481 | 8 |
| 3.1455 | 2.9082 | 9 |
| 3.1068 | 2.8718 | 10 |
| 3.0697 | 2.8387 | 11 |
| 3.0381 | 2.8105 | 12 |
| 3.0050 | 2.7825 | 13 |
| 2.9796 | 2.7568 | 14 |
| 2.9510 | 2.7350 | 15 |
| 2.9259 | 2.7096 | 16 |
| 2.9053 | 2.6881 | 17 |
| 2.8833 | 2.6696 | 18 |
| 2.8599 | 2.6510 | 19 |
| 2.8403 | 2.6328 | 20 |
| 2.8207 | 2.6171 | 21 |
| 2.8046 | 2.5999 | 22 |
| 2.7861 | 2.5857 | 23 |
| 2.7715 | 2.5699 | 24 |
| 2.7557 | 2.5542 | 25 |
| 2.7387 | 2.5420 | 26 |
| 2.7225 | 2.5299 | 27 |
| 2.7085 | 2.5182 | 28 |
| 2.6950 | 2.5081 | 29 |
| 2.6818 | 2.4951 | 30 |
| 2.6687 | 2.4864 | 31 |
| 2.6578 | 2.4760 | 32 |
| 2.6461 | 2.4651 | 33 |
| 2.6334 | 2.4559 | 34 |
| 2.6213 | 2.4477 | 35 |
| 2.6096 | 2.4373 | 36 |
| 2.5993 | 2.4297 | 37 |
| 2.5906 | 2.4208 | 38 |
| 2.5778 | 2.4100 | 39 |
| 2.5703 | 2.4025 | 40 |
| 2.5594 | 2.3962 | 41 |
| 2.5521 | 2.3901 | 42 |
| 2.5414 | 2.3808 | 43 |
| 2.5318 | 2.3726 | 44 |
| 2.5235 | 2.3684 | 45 |
| 2.5165 | 2.3592 | 46 |
| 2.5060 | 2.3507 | 47 |
| 2.4972 | 2.3466 | 48 |
| 2.4892 | 2.3388 | 49 |
| 2.4807 | 2.3325 | 50 |
| 2.4732 | 2.3281 | 51 |
| 2.4654 | 2.3210 | 52 |
| 2.4592 | 2.3138 | 53 |
| 2.4525 | 2.3100 | 54 |
| 2.4439 | 2.3046 | 55 |
| 2.4349 | 2.2980 | 56 |
| 2.4283 | 2.2926 | 57 |
| 2.4222 | 2.2884 | 58 |
| 2.4139 | 2.2824 | 59 |
| 2.4071 | 2.2759 | 60 |
| 2.4008 | 2.2705 | 61 |
| 2.3941 | 2.2664 | 62 |
| 2.3882 | 2.2588 | 63 |
| 2.3813 | 2.2566 | 64 |
| 2.3759 | 2.2498 | 65 |
| 2.3674 | 2.2461 | 66 |
| 2.3618 | 2.2425 | 67 |
| 2.3534 | 2.2377 | 68 |
| 2.3522 | 2.2314 | 69 |
| 2.3398 | 2.2269 | 70 |
| 2.3391 | 2.2241 | 71 |
| 2.3303 | 2.2184 | 72 |
| 2.3275 | 2.2137 | 73 |
| 2.3190 | 2.2100 | 74 |
| 2.3159 | 2.2048 | 75 |
| 2.3078 | 2.2011 | 76 |
| 2.3048 | 2.1971 | 77 |
| 2.3005 | 2.1936 | 78 |
| 2.2938 | 2.1899 | 79 |
| 2.2892 | 2.1859 | 80 |
| 2.2824 | 2.1819 | 81 |
| 2.2758 | 2.1787 | 82 |
| 2.2739 | 2.1757 | 83 |
| 2.2689 | 2.1716 | 84 |
| 2.2623 | 2.1664 | 85 |
| 2.2574 | 2.1657 | 86 |
| 2.2537 | 2.1618 | 87 |
| 2.2483 | 2.1563 | 88 |
| 2.2407 | 2.1554 | 89 |
| 2.2387 | 2.1510 | 90 |
| 2.2351 | 2.1469 | 91 |
| 2.2286 | 2.1436 | 92 |
| 2.2226 | 2.1413 | 93 |
| 2.2171 | 2.1395 | 94 |
| 2.2159 | 2.1342 | 95 |
| 2.2109 | 2.1314 | 96 |
| 2.2041 | 2.1284 | 97 |
| 2.1999 | 2.1260 | 98 |
| 2.1941 | 2.1230 | 99 |
### Framework versions
- Transformers 4.35.2
- TensorFlow 2.14.0
- Datasets 2.15.0
- Tokenizers 0.15.0