santhoshmlops committed on
Commit d2e406b
1 Parent(s): 49f47d3

End of training

Files changed (1)
  1. README.md +4 -3
README.md CHANGED
@@ -36,13 +36,14 @@ More information needed
 
 The following hyperparameters were used during training:
 - learning_rate: 0.0002
-- train_batch_size: 5
+- train_batch_size: 1
 - eval_batch_size: 8
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: cosine
 - lr_scheduler_warmup_ratio: 0.03
-- training_steps: 500
+- training_steps: 100
+- mixed_precision_training: Native AMP
 
 ### Training results
 
@@ -52,6 +53,6 @@ The following hyperparameters were used during training:
 
 - PEFT 0.9.0
 - Transformers 4.38.2
-- Pytorch 2.1.0+cu121
+- Pytorch 2.2.1+cu121
 - Datasets 2.18.0
 - Tokenizers 0.15.2
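For reference, the post-commit hyperparameters can be collected into a plain Python dict. This is only an illustrative summary of the values in the diff above (key names loosely follow the `transformers.TrainingArguments` convention), not the actual training script:

```python
# Hyperparameters after this commit, taken from the README diff above.
# Key names are illustrative; the real training script is not shown here.
training_config = {
    "learning_rate": 2e-4,
    "train_batch_size": 1,            # was 5 before this commit
    "eval_batch_size": 8,
    "seed": 42,
    "adam_betas": (0.9, 0.999),       # Adam optimizer betas
    "adam_epsilon": 1e-8,
    "lr_scheduler_type": "cosine",
    "warmup_ratio": 0.03,
    "training_steps": 100,            # was 500 before this commit
    "mixed_precision": "Native AMP",  # newly documented in this commit
}
```

Note that the commit reduces both the batch size (5 → 1) and the step count (500 → 100), and records that training ran under native automatic mixed precision.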