dhiya96 committed
Commit 8abffda
1 Parent(s): c972764

End of training
README.md CHANGED
@@ -17,11 +17,11 @@ should probably proofread and complete it, then remove this comment. -->
 
 This model is a fine-tuned version of [Falconsai/text_summarization](https://huggingface.co/Falconsai/text_summarization) on an unknown dataset.
 It achieves the following results on the evaluation set:
- - Loss: 1.6704
- - Rouge1: 14.8468
- - Rouge2: 6.605
- - Rougel: 12.5912
- - Rougelsum: 13.8844
 - Gen Len: 19.0
 
 ## Model description
@@ -47,38 +47,53 @@ The following hyperparameters were used during training:
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
- - num_epochs: 25
 - mixed_precision_training: Native AMP
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
 |:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:-------:|:---------:|:-------:|
- | No log | 1.0 | 102 | 2.0572 | 14.815 | 5.9056 | 12.0659 | 13.6223 | 19.0 |
- | No log | 2.0 | 204 | 1.9284 | 14.8263 | 6.0075 | 12.0672 | 13.5783 | 19.0 |
- | No log | 3.0 | 306 | 1.8635 | 14.7132 | 6.2969 | 12.1804 | 13.7033 | 19.0 |
- | No log | 4.0 | 408 | 1.8187 | 14.5759 | 6.3479 | 12.2351 | 13.5408 | 19.0 |
- | 2.2351 | 5.0 | 510 | 1.7896 | 14.6503 | 6.4258 | 12.2892 | 13.6634 | 19.0 |
- | 2.2351 | 6.0 | 612 | 1.7742 | 14.5794 | 6.3181 | 12.2248 | 13.5458 | 19.0 |
- | 2.2351 | 7.0 | 714 | 1.7523 | 14.5132 | 6.3226 | 12.2905 | 13.4683 | 19.0 |
- | 2.2351 | 8.0 | 816 | 1.7409 | 14.4054 | 6.301 | 12.1657 | 13.3541 | 19.0 |
- | 2.2351 | 9.0 | 918 | 1.7266 | 14.523 | 6.4309 | 12.2507 | 13.4937 | 19.0 |
- | 1.9331 | 10.0 | 1020 | 1.7176 | 14.6255 | 6.5518 | 12.2987 | 13.5785 | 19.0 |
- | 1.9331 | 11.0 | 1122 | 1.7080 | 14.7579 | 6.5473 | 12.3413 | 13.7116 | 19.0 |
- | 1.9331 | 12.0 | 1224 | 1.7026 | 14.744 | 6.6321 | 12.3666 | 13.7439 | 19.0 |
- | 1.9331 | 13.0 | 1326 | 1.6952 | 14.9263 | 6.7745 | 12.5911 | 13.9137 | 19.0 |
- | 1.9331 | 14.0 | 1428 | 1.6924 | 14.9758 | 6.8123 | 12.6647 | 14.0481 | 19.0 |
- | 1.8412 | 15.0 | 1530 | 1.6874 | 14.9901 | 6.7148 | 12.57 | 14.0264 | 19.0 |
- | 1.8412 | 16.0 | 1632 | 1.6838 | 14.9599 | 6.7418 | 12.55 | 14.0427 | 19.0 |
- | 1.8412 | 17.0 | 1734 | 1.6807 | 14.9124 | 6.7273 | 12.5752 | 13.9551 | 19.0 |
- | 1.8412 | 18.0 | 1836 | 1.6779 | 14.8536 | 6.7331 | 12.5783 | 13.9188 | 19.0 |
- | 1.8412 | 19.0 | 1938 | 1.6744 | 14.9394 | 6.7234 | 12.6105 | 13.9947 | 19.0 |
- | 1.7905 | 20.0 | 2040 | 1.6736 | 14.9112 | 6.6709 | 12.603 | 13.9438 | 19.0 |
- | 1.7905 | 21.0 | 2142 | 1.6724 | 14.9004 | 6.6578 | 12.6049 | 13.9428 | 19.0 |
- | 1.7905 | 22.0 | 2244 | 1.6724 | 14.7264 | 6.5678 | 12.5109 | 13.8099 | 19.0 |
- | 1.7905 | 23.0 | 2346 | 1.6713 | 14.7045 | 6.5554 | 12.4921 | 13.799 | 19.0 |
- | 1.7905 | 24.0 | 2448 | 1.6708 | 14.7174 | 6.5556 | 12.4934 | 13.8 | 19.0 |
- | 1.7682 | 25.0 | 2550 | 1.6704 | 14.8468 | 6.605 | 12.5912 | 13.8844 | 19.0 |
 
 
 ### Framework versions
 
 
 This model is a fine-tuned version of [Falconsai/text_summarization](https://huggingface.co/Falconsai/text_summarization) on an unknown dataset.
 It achieves the following results on the evaluation set:
+ - Loss: 1.6071
+ - Rouge1: 15.4764
+ - Rouge2: 7.3425
+ - Rougel: 13.0298
+ - Rougelsum: 14.3613
 - Gen Len: 19.0
 
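The Rouge1 figure above is a unigram-overlap F1 score (reported on a 0–100 scale). A minimal sketch of the computation, assuming plain whitespace tokenization; the card's numbers come from the `rouge_score` package, which additionally applies Porter stemming and bootstrap aggregation, so exact values differ:

```python
from collections import Counter

def rouge1_f(candidate: str, reference: str) -> float:
    """ROUGE-1 F1: unigram-overlap F-measure between two texts."""
    cand = Counter(candidate.lower().split())
    ref = Counter(reference.lower().split())
    if not cand or not ref:
        return 0.0
    # Clipped overlap: each unigram counts at most as often as in the reference.
    overlap = sum((cand & ref).values())
    precision = overlap / sum(cand.values())
    recall = overlap / sum(ref.values())
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

print(round(rouge1_f("the cat sat on the mat", "the cat lay on the mat"), 4))  # 0.8333
```

Rouge2 and RougeL follow the same pattern over bigrams and longest-common-subsequence matches respectively.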
 ## Model description
 
 - seed: 42
 - optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
 - lr_scheduler_type: linear
+ - num_epochs: 40
 - mixed_precision_training: Native AMP
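With `lr_scheduler_type: linear` and no warmup, the learning rate decays linearly from its initial value to zero over the whole run. A minimal sketch of that schedule; the base rate of 2e-5 is an assumption here (the learning-rate line falls outside this diff excerpt), as is the step count taken from the table's 102 steps per epoch:

```python
def linear_lr(step: int, total_steps: int, base_lr: float = 2e-5) -> float:
    # Linear decay from base_lr at step 0 down to 0 at total_steps, no warmup.
    # base_lr=2e-5 is an assumed placeholder, not taken from this card.
    return base_lr * max(0.0, (total_steps - step) / total_steps)

# 40 epochs x 102 optimizer steps per epoch = 4080 total steps
print(linear_lr(0, 4080))     # 2e-05
print(linear_lr(2040, 4080))  # 1e-05 (halfway through training)
```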
 
 ### Training results
 
 | Training Loss | Epoch | Step | Validation Loss | Rouge1 | Rouge2 | Rougel | Rougelsum | Gen Len |
 |:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:-------:|:---------:|:-------:|
+ | No log | 1.0 | 102 | 1.5996 | 15.7162 | 7.3225 | 13.1679 | 14.5316 | 19.0 |
+ | No log | 2.0 | 204 | 1.5991 | 15.7364 | 7.3916 | 13.2205 | 14.5865 | 19.0 |
+ | No log | 3.0 | 306 | 1.5948 | 15.7337 | 7.4936 | 13.2031 | 14.5941 | 19.0 |
+ | No log | 4.0 | 408 | 1.5935 | 15.7661 | 7.4892 | 13.1138 | 14.5123 | 19.0 |
+ | 1.4093 | 5.0 | 510 | 1.5972 | 15.6328 | 7.2837 | 13.1138 | 14.4789 | 19.0 |
+ | 1.4093 | 6.0 | 612 | 1.6016 | 15.5382 | 7.3117 | 13.0203 | 14.3907 | 19.0 |
+ | 1.4093 | 7.0 | 714 | 1.5983 | 15.5582 | 7.2532 | 12.9421 | 14.3971 | 19.0 |
+ | 1.4093 | 8.0 | 816 | 1.6039 | 15.5287 | 7.3152 | 13.002 | 14.3652 | 19.0 |
+ | 1.4093 | 9.0 | 918 | 1.6016 | 15.5916 | 7.3367 | 13.0811 | 14.442 | 19.0 |
+ | 1.3525 | 10.0 | 1020 | 1.6017 | 15.749 | 7.6355 | 13.1754 | 14.6339 | 19.0 |
+ | 1.3525 | 11.0 | 1122 | 1.5992 | 15.6529 | 7.5216 | 13.1041 | 14.5668 | 19.0 |
+ | 1.3525 | 12.0 | 1224 | 1.5977 | 15.64 | 7.3843 | 13.0609 | 14.5366 | 19.0 |
+ | 1.3525 | 13.0 | 1326 | 1.5993 | 15.6516 | 7.4595 | 13.1143 | 14.5799 | 19.0 |
+ | 1.3525 | 14.0 | 1428 | 1.6040 | 15.6532 | 7.5787 | 13.0764 | 14.5464 | 19.0 |
+ | 1.3156 | 15.0 | 1530 | 1.5998 | 15.4999 | 7.349 | 13.016 | 14.4233 | 19.0 |
+ | 1.3156 | 16.0 | 1632 | 1.6039 | 15.4718 | 7.2392 | 12.9167 | 14.3196 | 19.0 |
+ | 1.3156 | 17.0 | 1734 | 1.6026 | 15.5434 | 7.376 | 12.9885 | 14.3673 | 19.0 |
+ | 1.3156 | 18.0 | 1836 | 1.6008 | 15.4092 | 7.2119 | 12.9495 | 14.286 | 19.0 |
+ | 1.3156 | 19.0 | 1938 | 1.6009 | 15.4604 | 7.4049 | 13.0264 | 14.3634 | 19.0 |
+ | 1.2849 | 20.0 | 2040 | 1.6028 | 15.4735 | 7.3749 | 12.9979 | 14.3637 | 19.0 |
+ | 1.2849 | 21.0 | 2142 | 1.6025 | 15.617 | 7.5495 | 13.0912 | 14.4945 | 19.0 |
+ | 1.2849 | 22.0 | 2244 | 1.6061 | 15.65 | 7.6043 | 13.119 | 14.5419 | 19.0 |
+ | 1.2849 | 23.0 | 2346 | 1.6039 | 15.5747 | 7.5283 | 13.0601 | 14.4706 | 19.0 |
+ | 1.2849 | 24.0 | 2448 | 1.6071 | 15.4923 | 7.4246 | 12.9747 | 14.3495 | 19.0 |
+ | 1.2625 | 25.0 | 2550 | 1.6030 | 15.5403 | 7.4373 | 13.1005 | 14.4791 | 19.0 |
+ | 1.2625 | 26.0 | 2652 | 1.6044 | 15.5232 | 7.4625 | 13.049 | 14.4455 | 19.0 |
+ | 1.2625 | 27.0 | 2754 | 1.6038 | 15.4961 | 7.4241 | 13.0409 | 14.4496 | 19.0 |
+ | 1.2625 | 28.0 | 2856 | 1.6048 | 15.5079 | 7.551 | 13.0814 | 14.4369 | 19.0 |
+ | 1.2625 | 29.0 | 2958 | 1.6067 | 15.4629 | 7.4087 | 13.0123 | 14.3897 | 19.0 |
+ | 1.2418 | 30.0 | 3060 | 1.6052 | 15.5104 | 7.518 | 13.0891 | 14.4284 | 19.0 |
+ | 1.2418 | 31.0 | 3162 | 1.6051 | 15.5104 | 7.4773 | 13.0686 | 14.4114 | 19.0 |
+ | 1.2418 | 32.0 | 3264 | 1.6044 | 15.5491 | 7.5342 | 13.1145 | 14.4742 | 19.0 |
+ | 1.2418 | 33.0 | 3366 | 1.6064 | 15.5321 | 7.4773 | 13.0686 | 14.4336 | 19.0 |
+ | 1.2418 | 34.0 | 3468 | 1.6055 | 15.5193 | 7.5178 | 13.0887 | 14.4521 | 19.0 |
+ | 1.2313 | 35.0 | 3570 | 1.6057 | 15.4739 | 7.4526 | 13.0326 | 14.3947 | 19.0 |
+ | 1.2313 | 36.0 | 3672 | 1.6057 | 15.4486 | 7.3244 | 12.9881 | 14.3346 | 19.0 |
+ | 1.2313 | 37.0 | 3774 | 1.6067 | 15.4764 | 7.3795 | 13.0402 | 14.3886 | 19.0 |
+ | 1.2313 | 38.0 | 3876 | 1.6072 | 15.4594 | 7.3028 | 12.9813 | 14.3339 | 19.0 |
+ | 1.2313 | 39.0 | 3978 | 1.6070 | 15.4764 | 7.3795 | 13.0402 | 14.3886 | 19.0 |
+ | 1.2274 | 40.0 | 4080 | 1.6071 | 15.4764 | 7.3425 | 13.0298 | 14.3613 | 19.0 |
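Validation loss in this run bottoms out at epoch 4 (1.5935) and drifts slightly upward afterwards while ROUGE stays roughly flat, so the final checkpoint is likely not the best one. A quick sketch of picking the best epoch from logged (epoch, validation loss) pairs, using a few rows copied from the table above:

```python
# A subset of (epoch, validation_loss) pairs from the training table above.
val_losses = [(1, 1.5996), (2, 1.5991), (3, 1.5948), (4, 1.5935),
              (5, 1.5972), (20, 1.6028), (40, 1.6071)]

# Select the epoch with the lowest validation loss.
best_epoch, best_loss = min(val_losses, key=lambda pair: pair[1])
print(best_epoch, best_loss)  # 4 1.5935
```

In a `transformers` training setup this is what `load_best_model_at_end` with `metric_for_best_model="eval_loss"` automates, assuming checkpoints were saved per epoch.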
 
 
 ### Framework versions
model.safetensors CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:47d41378207213f8b8b92b27c18aeb01c63a46ad4b3b1c806344c7162b497961
+ oid sha256:38eeadac2c340b5f0bc0d308d576b85e0ff374c3fa89473b8c1ea029e36194b8
 size 242041896
runs/Mar11_10-48-01_4f6cf6339e10/events.out.tfevents.1710154091.4f6cf6339e10.656.1 CHANGED
@@ -1,3 +1,3 @@
 version https://git-lfs.github.com/spec/v1
- oid sha256:9035ee2888a84248fa1fe737eee55af9eb4acaa1b8b6ec1d48989ad3d85e8f22
- size 27837
+ oid sha256:03f5076a2551a0f1501f52dad8b01334fab0af8dedcad3303c85afe5f0c38ad1
+ size 28716