---
license: apache-2.0
base_model: t5-base
tags:
- generated_from_keras_callback
model-index:
- name: CapitainData/dyu-fr-t5-base_v3
  results: []
---


# CapitainData/dyu-fr-t5-base_v3

This model is a fine-tuned version of [t5-base](https://huggingface.co/t5-base) on an unknown dataset.
It achieves the following results on the evaluation set at the final epoch:
- Train Loss: 0.6666
- Validation Loss: 3.0327
- Epoch: 88

Note that validation loss reached its minimum (2.4125) around epoch 22 and rose steadily afterward while training loss continued to fall (see the training results below), so this final-epoch checkpoint is substantially overfit; an earlier checkpoint would likely generalize better.

## Model description

More information needed. The repository name suggests a Dyula-to-French (`dyu` → `fr`) machine-translation model built on T5-base, but the task and training data have not been confirmed by the author.

## Intended uses & limitations

More information needed
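
Assuming from the repository name that this is a Dyula-to-French translation model, it can be loaded like any other TensorFlow T5 checkpoint on the Hub. This is a minimal sketch; the input sentence is a placeholder to be replaced with real Dyula text.

```python
from transformers import AutoTokenizer, TFAutoModelForSeq2SeqLM

repo_id = "CapitainData/dyu-fr-t5-base_v3"
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = TFAutoModelForSeq2SeqLM.from_pretrained(repo_id)

# Tokenize a (placeholder) Dyula sentence and generate a French translation.
inputs = tokenizer("<your Dyula sentence here>", return_tensors="tf")
output_ids = model.generate(**inputs, max_new_tokens=64)
translation = tokenizer.decode(output_ids[0], skip_special_tokens=True)
print(translation)
```

`generate` is called with its defaults here; beam search (`num_beams`) may improve translation quality.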

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:
- optimizer: {'name': 'AdamWeightDecay', 'learning_rate': 2e-05, 'decay': 0.0, 'beta_1': 0.9, 'beta_2': 0.999, 'epsilon': 1e-07, 'amsgrad': False, 'weight_decay_rate': 0.01}
- training_precision: float32
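
The optimizer dictionary above corresponds to the TensorFlow `AdamWeightDecay` class shipped with `transformers`. A minimal sketch of recreating this setup (the commented training loop and dataset names are hypothetical, not taken from this card; TensorFlow 2.16 ships Keras 3, so the legacy `tf-keras` package may also be required):

```python
from transformers import AdamWeightDecay

# Optimizer configured exactly as in the hyperparameters listed above.
optimizer = AdamWeightDecay(
    learning_rate=2e-5,
    beta_1=0.9,
    beta_2=0.999,
    epsilon=1e-7,
    amsgrad=False,
    weight_decay_rate=0.01,
)

# Hypothetical Keras training loop (the card does not describe the data pipeline):
# model = TFAutoModelForSeq2SeqLM.from_pretrained("t5-base")
# model.compile(optimizer=optimizer)  # Transformers TF models compute their own loss
# model.fit(train_dataset, validation_data=val_dataset, epochs=89)
```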

### Training results

| Train Loss | Validation Loss | Epoch |
|:----------:|:---------------:|:-----:|
| 3.3233     | 2.8819          | 0     |
| 3.0679     | 2.7736          | 1     |
| 2.9557     | 2.7031          | 2     |
| 2.8537     | 2.6517          | 3     |
| 2.7672     | 2.6141          | 4     |
| 2.6959     | 2.5790          | 5     |
| 2.6234     | 2.5559          | 6     |
| 2.5663     | 2.5288          | 7     |
| 2.5025     | 2.5099          | 8     |
| 2.4535     | 2.4976          | 9     |
| 2.3996     | 2.4791          | 10    |
| 2.3570     | 2.4646          | 11    |
| 2.3096     | 2.4504          | 12    |
| 2.2604     | 2.4454          | 13    |
| 2.2212     | 2.4427          | 14    |
| 2.1817     | 2.4356          | 15    |
| 2.1437     | 2.4339          | 16    |
| 2.1022     | 2.4223          | 17    |
| 2.0667     | 2.4204          | 18    |
| 2.0382     | 2.4182          | 19    |
| 1.9938     | 2.4242          | 20    |
| 1.9631     | 2.4265          | 21    |
| 1.9289     | 2.4125          | 22    |
| 1.8995     | 2.4177          | 23    |
| 1.8716     | 2.4195          | 24    |
| 1.8402     | 2.4214          | 25    |
| 1.8068     | 2.4280          | 26    |
| 1.7809     | 2.4226          | 27    |
| 1.7446     | 2.4455          | 28    |
| 1.7253     | 2.4453          | 29    |
| 1.6978     | 2.4497          | 30    |
| 1.6735     | 2.4501          | 31    |
| 1.6427     | 2.4633          | 32    |
| 1.6168     | 2.4633          | 33    |
| 1.5921     | 2.4670          | 34    |
| 1.5688     | 2.4659          | 35    |
| 1.5417     | 2.4874          | 36    |
| 1.5189     | 2.4790          | 37    |
| 1.4963     | 2.4961          | 38    |
| 1.4715     | 2.4951          | 39    |
| 1.4486     | 2.5063          | 40    |
| 1.4263     | 2.5078          | 41    |
| 1.4068     | 2.5306          | 42    |
| 1.3814     | 2.5477          | 43    |
| 1.3645     | 2.5501          | 44    |
| 1.3394     | 2.5548          | 45    |
| 1.3223     | 2.5493          | 46    |
| 1.3060     | 2.5572          | 47    |
| 1.2850     | 2.6033          | 48    |
| 1.2566     | 2.5900          | 49    |
| 1.2426     | 2.6090          | 50    |
| 1.2266     | 2.6152          | 51    |
| 1.2067     | 2.6252          | 52    |
| 1.1842     | 2.6435          | 53    |
| 1.1680     | 2.6481          | 54    |
| 1.1476     | 2.6438          | 55    |
| 1.1295     | 2.6559          | 56    |
| 1.1128     | 2.6910          | 57    |
| 1.1000     | 2.6722          | 58    |
| 1.0787     | 2.6840          | 59    |
| 1.0636     | 2.7139          | 60    |
| 1.0425     | 2.7218          | 61    |
| 1.0298     | 2.7196          | 62    |
| 1.0150     | 2.7374          | 63    |
| 0.9989     | 2.7367          | 64    |
| 0.9811     | 2.7660          | 65    |
| 0.9674     | 2.7741          | 66    |
| 0.9490     | 2.7701          | 67    |
| 0.9322     | 2.7856          | 68    |
| 0.9197     | 2.7829          | 69    |
| 0.9010     | 2.8053          | 70    |
| 0.8894     | 2.8119          | 71    |
| 0.8732     | 2.8408          | 72    |
| 0.8597     | 2.8401          | 73    |
| 0.8404     | 2.8706          | 74    |
| 0.8317     | 2.8872          | 75    |
| 0.8204     | 2.8772          | 76    |
| 0.8083     | 2.8962          | 77    |
| 0.7905     | 2.9103          | 78    |
| 0.7825     | 2.9111          | 79    |
| 0.7659     | 2.9394          | 80    |
| 0.7486     | 2.9496          | 81    |
| 0.7359     | 2.9663          | 82    |
| 0.7250     | 2.9775          | 83    |
| 0.7133     | 2.9877          | 84    |
| 0.7035     | 2.9884          | 85    |
| 0.6912     | 2.9902          | 86    |
| 0.6762     | 3.0133          | 87    |
| 0.6666     | 3.0327          | 88    |


### Framework versions

- Transformers 4.38.2
- TensorFlow 2.16.1
- Datasets 2.18.0
- Tokenizers 0.15.2