Pamzyy committed
Commit 76155fe
1 Parent(s): 36f81b1

Model save

Files changed (1)
  1. README.md +23 -6
README.md CHANGED
@@ -15,6 +15,8 @@ should probably proofread and complete it, then remove this comment. -->
 # sinhala_gpt2
 
 This model is a fine-tuned version of [gpt2](https://huggingface.co/gpt2) on the None dataset.
+It achieves the following results on the evaluation set:
+- Loss: 3.4181
 
 ## Model description
 
@@ -33,19 +35,34 @@ More information needed
 ### Training hyperparameters
 
 The following hyperparameters were used during training:
-- learning_rate: 5e-05
-- train_batch_size: 64
-- eval_batch_size: 64
+- learning_rate: 5e-06
+- train_batch_size: 16
+- eval_batch_size: 16
 - seed: 42
 - gradient_accumulation_steps: 4
-- total_train_batch_size: 256
+- total_train_batch_size: 64
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: cosine
-- lr_scheduler_warmup_steps: 500
-- num_epochs: 5
+- lr_scheduler_warmup_steps: 10
+- num_epochs: 1
 
 ### Training results
 
+| Training Loss | Epoch  | Step | Validation Loss |
+|:-------------:|:------:|:----:|:---------------:|
+| 12.5768       | 0.0737 | 20   | 11.7031         |
+| 10.6016       | 0.1475 | 40   | 10.1428         |
+| 9.5592        | 0.2212 | 60   | 8.4000          |
+| 7.7086        | 0.2949 | 80   | 6.1398          |
+| 6.1288        | 0.3687 | 100  | 5.1259          |
+| 5.2551        | 0.4424 | 120  | 4.4283          |
+| 4.7127        | 0.5161 | 140  | 4.0241          |
+| 4.3572        | 0.5899 | 160  | 3.7673          |
+| 4.1243        | 0.6636 | 180  | 3.6012         |
+| 3.9714        | 0.7373 | 200  | 3.5126         |
+| 3.8867        | 0.8111 | 220  | 3.4489         |
+| 3.8334        | 0.8848 | 240  | 3.4256         |
+| 3.8204        | 0.9585 | 260  | 3.4181         |
 
 
 ### Framework versions
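
For reference, below is a minimal sketch of how the updated hyperparameters could be passed to Hugging Face's `TrainingArguments`/`Trainer` for a causal-LM fine-tune of gpt2. This is an assumption about the training setup, not the script used in this commit; the dataset objects and `output_dir` name are hypothetical placeholders.

```python
# Sketch only: maps the hyperparameters listed in the README diff onto
# transformers.TrainingArguments. Dataset loading/tokenization are omitted.
from transformers import AutoModelForCausalLM, AutoTokenizer, Trainer, TrainingArguments

model = AutoModelForCausalLM.from_pretrained("gpt2")
tokenizer = AutoTokenizer.from_pretrained("gpt2")

training_args = TrainingArguments(
    output_dir="sinhala_gpt2",        # hypothetical output path
    learning_rate=5e-06,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    gradient_accumulation_steps=4,    # effective train batch size: 16 * 4 = 64
    seed=42,
    lr_scheduler_type="cosine",
    warmup_steps=10,
    num_train_epochs=1,
    adam_beta1=0.9,                   # Adam betas=(0.9, 0.999), epsilon=1e-08
    adam_beta2=0.999,
    adam_epsilon=1e-08,
)

# trainer = Trainer(
#     model=model,
#     args=training_args,
#     train_dataset=train_dataset,    # hypothetical, not part of this commit
#     eval_dataset=eval_dataset,      # hypothetical, not part of this commit
#     tokenizer=tokenizer,
# )
# trainer.train()
```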