---
license: apache-2.0
base_model: Falconsai/text_summarization
tags:
  - generated_from_trainer
metrics:
  - rouge
model-index:
  - name: text_summarization-finetuned-stocknews_1900_100
    results: []
---

# text_summarization-finetuned-stocknews_1900_100

This model is a fine-tuned version of [Falconsai/text_summarization](https://huggingface.co/Falconsai/text_summarization) on an unknown dataset. It achieves the following results on the evaluation set:

- Loss: 1.6704
- Rouge1: 14.8468
- Rouge2: 6.605
- Rougel: 12.5912
- Rougelsum: 13.8844
- Gen Len: 19.0
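Since the reported loss is a token-level cross-entropy, it can also be read as a perplexity via `exp(loss)`; a quick sanity check on the evaluation loss from above:

```python
import math

# Token-level cross-entropy loss reported on the evaluation set.
eval_loss = 1.6704

# Perplexity is the exponential of the cross-entropy loss.
perplexity = math.exp(eval_loss)

print(f"perplexity = {perplexity:.2f}")  # about 5.31
```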

## Model description

More information needed

## Intended uses & limitations

More information needed

## Training and evaluation data

More information needed

## Training procedure

### Training hyperparameters

The following hyperparameters were used during training:

- learning_rate: 2e-05
- train_batch_size: 4
- eval_batch_size: 4
- seed: 42
- optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
- lr_scheduler_type: linear
- num_epochs: 25
- mixed_precision_training: Native AMP
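With a linear scheduler and no warmup listed among the hyperparameters, the learning rate decays from 2e-05 to zero over the full 2550 optimizer steps (102 steps per epoch × 25 epochs, per the results table). A minimal sketch of that schedule, assuming zero warmup steps:

```python
def linear_lr(step, total_steps=2550, base_lr=2e-05):
    """Linearly decay base_lr to 0 over total_steps (assumes no warmup)."""
    return base_lr * max(0.0, 1.0 - step / total_steps)

print(linear_lr(0))     # start of training: 2e-05
print(linear_lr(1275))  # halfway: 1e-05
print(linear_lr(2550))  # end of training: 0.0
```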

### Training results

| Training Loss | Epoch | Step | Validation Loss | Rouge1  | Rouge2 | Rougel  | Rougelsum | Gen Len |
|:-------------:|:-----:|:----:|:---------------:|:-------:|:------:|:-------:|:---------:|:-------:|
| No log        | 1.0   | 102  | 2.0572          | 14.815  | 5.9056 | 12.0659 | 13.6223   | 19.0    |
| No log        | 2.0   | 204  | 1.9284          | 14.8263 | 6.0075 | 12.0672 | 13.5783   | 19.0    |
| No log        | 3.0   | 306  | 1.8635          | 14.7132 | 6.2969 | 12.1804 | 13.7033   | 19.0    |
| No log        | 4.0   | 408  | 1.8187          | 14.5759 | 6.3479 | 12.2351 | 13.5408   | 19.0    |
| 2.2351        | 5.0   | 510  | 1.7896          | 14.6503 | 6.4258 | 12.2892 | 13.6634   | 19.0    |
| 2.2351        | 6.0   | 612  | 1.7742          | 14.5794 | 6.3181 | 12.2248 | 13.5458   | 19.0    |
| 2.2351        | 7.0   | 714  | 1.7523          | 14.5132 | 6.3226 | 12.2905 | 13.4683   | 19.0    |
| 2.2351        | 8.0   | 816  | 1.7409          | 14.4054 | 6.301  | 12.1657 | 13.3541   | 19.0    |
| 2.2351        | 9.0   | 918  | 1.7266          | 14.523  | 6.4309 | 12.2507 | 13.4937   | 19.0    |
| 1.9331        | 10.0  | 1020 | 1.7176          | 14.6255 | 6.5518 | 12.2987 | 13.5785   | 19.0    |
| 1.9331        | 11.0  | 1122 | 1.7080          | 14.7579 | 6.5473 | 12.3413 | 13.7116   | 19.0    |
| 1.9331        | 12.0  | 1224 | 1.7026          | 14.744  | 6.6321 | 12.3666 | 13.7439   | 19.0    |
| 1.9331        | 13.0  | 1326 | 1.6952          | 14.9263 | 6.7745 | 12.5911 | 13.9137   | 19.0    |
| 1.9331        | 14.0  | 1428 | 1.6924          | 14.9758 | 6.8123 | 12.6647 | 14.0481   | 19.0    |
| 1.8412        | 15.0  | 1530 | 1.6874          | 14.9901 | 6.7148 | 12.57   | 14.0264   | 19.0    |
| 1.8412        | 16.0  | 1632 | 1.6838          | 14.9599 | 6.7418 | 12.55   | 14.0427   | 19.0    |
| 1.8412        | 17.0  | 1734 | 1.6807          | 14.9124 | 6.7273 | 12.5752 | 13.9551   | 19.0    |
| 1.8412        | 18.0  | 1836 | 1.6779          | 14.8536 | 6.7331 | 12.5783 | 13.9188   | 19.0    |
| 1.8412        | 19.0  | 1938 | 1.6744          | 14.9394 | 6.7234 | 12.6105 | 13.9947   | 19.0    |
| 1.7905        | 20.0  | 2040 | 1.6736          | 14.9112 | 6.6709 | 12.603  | 13.9438   | 19.0    |
| 1.7905        | 21.0  | 2142 | 1.6724          | 14.9004 | 6.6578 | 12.6049 | 13.9428   | 19.0    |
| 1.7905        | 22.0  | 2244 | 1.6724          | 14.7264 | 6.5678 | 12.5109 | 13.8099   | 19.0    |
| 1.7905        | 23.0  | 2346 | 1.6713          | 14.7045 | 6.5554 | 12.4921 | 13.799    | 19.0    |
| 1.7905        | 24.0  | 2448 | 1.6708          | 14.7174 | 6.5556 | 12.4934 | 13.8      | 19.0    |
| 1.7682        | 25.0  | 2550 | 1.6704          | 14.8468 | 6.605  | 12.5912 | 13.8844   | 19.0    |
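The Rouge1 and Rouge2 columns are unigram and bigram overlap F1 scores (scaled ×100). The card was scored with the `rouge` metric, but the idea behind ROUGE-1 can be sketched in a few lines of pure Python:

```python
from collections import Counter

def rouge1_f1(prediction: str, reference: str) -> float:
    """ROUGE-1: F1 over clipped unigram overlap between prediction and reference."""
    pred_counts = Counter(prediction.lower().split())
    ref_counts = Counter(reference.lower().split())
    # Clipped overlap: each unigram counts at most as often as it appears in either text.
    overlap = sum((pred_counts & ref_counts).values())
    if overlap == 0:
        return 0.0
    precision = overlap / sum(pred_counts.values())
    recall = overlap / sum(ref_counts.values())
    return 2 * precision * recall / (precision + recall)

print(rouge1_f1("the cat sat", "the cat sat on the mat"))  # 2/3: precision 1.0, recall 0.5
```

This toy version skips the stemming and tokenization details of the real metric, so its numbers will not exactly match the table above.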

### Framework versions

- Transformers 4.38.2
- Pytorch 2.1.0+cu121
- Datasets 2.18.0
- Tokenizers 0.15.2