
flan-t5-base-samsum

This model is a fine-tuned version of google/flan-t5-base on the samsum dataset. It achieves the following results on the evaluation set (a reproduction sketch follows the list):

  • Loss: 1.3736
  • Rouge1: 47.355
  • Rouge2: 23.7601
  • Rougel: 39.8403
  • Rougelsum: 43.4718
  • Gen Len: 17.1575
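
The scores above come from the Trainer's evaluation loop during fine-tuning. As a rough, hedged sketch, similar ROUGE numbers could be recomputed with the evaluate and datasets libraries; the sample size, generation length, and dataset loading details below are illustrative assumptions, not the exact evaluation configuration used for this card.

import evaluate
from datasets import load_dataset
from transformers import pipeline

# Note: the samsum loading script needs py7zr and may not load on every datasets version.
rouge = evaluate.load("rouge")
test_set = load_dataset("samsum", split="test")
pipe = pipeline("summarization", model="sharmax-vikas/flan-t5-base-samsum")

sample = test_set.select(range(8))                 # small sample to keep the sketch fast
outputs = pipe(sample["dialogue"], max_length=64)  # max_length is an assumption
predictions = [o["summary_text"] for o in outputs]

print(rouge.compute(predictions=predictions, references=sample["summary"]))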

Model description

More information needed

Intended uses & limitations

More information needed

Training and evaluation data

More information needed

Training procedure

Training hyperparameters

The following hyperparameters were used during training (a code sketch mapping them onto training arguments follows the list):

  • learning_rate: 5e-05
  • train_batch_size: 16
  • eval_batch_size: 16
  • seed: 42
  • optimizer: Adam with betas=(0.9,0.999) and epsilon=1e-08
  • lr_scheduler_type: linear
  • num_epochs: 5
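
The Adam settings listed above match the Trainer's default optimizer. As a rough illustration, the hyperparameters map onto Seq2SeqTrainingArguments as in the sketch below; the output directory, evaluation strategy, and predict_with_generate flag are assumptions for the sketch, not values stated in this card.

from transformers import Seq2SeqTrainingArguments

# Hedged sketch of the hyperparameters above as Seq2SeqTrainingArguments.
training_args = Seq2SeqTrainingArguments(
    output_dir="flan-t5-base-samsum",      # assumed name, not stated in the card
    learning_rate=5e-5,
    per_device_train_batch_size=16,
    per_device_eval_batch_size=16,
    seed=42,
    lr_scheduler_type="linear",
    num_train_epochs=5,
    predict_with_generate=True,            # assumed: needed to report ROUGE and Gen Len
    eval_strategy="epoch",                 # assumed from the per-epoch results below
)
# Adam with betas=(0.9, 0.999) and epsilon=1e-08 is the Trainer's default optimizer.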

Training results

Training Loss   Epoch   Step   Validation Loss   Rouge1    Rouge2    Rougel    Rougelsum   Gen Len
1.3641          1.0     921    1.3780            47.4054   23.6308   39.8273   43.3697     17.3004
1.3074          2.0     1842   1.3736            47.355    23.7601   39.8403   43.4718     17.1575
1.2592          3.0     2763   1.3740            47.2208   23.4972   39.7293   43.2546     17.2320
1.2232          4.0     3684   1.3794            47.9156   24.2451   40.2628   43.9122     17.4017
1.2042          5.0     4605   1.3780            47.8982   24.1707   40.2955   43.8939     17.3712

Framework versions

  • Transformers 4.44.2
  • PyTorch 2.4.0
  • Datasets 3.0.0
  • Tokenizers 0.19.1
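
To confirm that a local environment matches these versions, a quick check:

import datasets, tokenizers, torch, transformers

print("Transformers:", transformers.__version__)
print("PyTorch:", torch.__version__)
print("Datasets:", datasets.__version__)
print("Tokenizers:", tokenizers.__version__)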

How to use

from transformers import pipeline

pipe = pipeline("summarization", model="sharmax-vikas/flan-t5-base-samsum")

res = pipe('''dialogue: 
Margaret: Hi, in December I'd like to meet on 4th and 11th around 10:00 or 11:00. 
Evans: Hi, 4th - we can meet at 10:00.
Evans: And 11th - at 11:00. 
Margaret: Okey. And what about 18th?
Evans: I'm not sure about 18th. 
Evans: I might not be in town. 
Margaret: Okey, so we'll see. 
Evans: Yes. And I'll let you know next week. 
Margaret: If it's not 18th, maybe we could meet on 17th?
Evans: If I go away, I won't also be 17th.
Margaret: Okey, I get it. 
Evans: But we could meet 14th, if you like?
Margaret: Hm, I'm not sure whether I'm avaliable. 
Evans: So let's set these dates later, ok?
Margaret: Okey and we see each other 4th 10:00. 
Evans: Yes!''')

print(f"flan-t5-base summary:\n{res[0]['summary_text']}")


# Output:
# flan-t5-base summary:
# Margaret and Evans will meet on the 4th and 11th of December. They will meet at 10:00 on the 18th and at 11:00 on the 17th. If it's not 18th, they can meet on 17th or 14th.
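
If you prefer not to use the pipeline, the checkpoint can also be loaded directly with AutoTokenizer and AutoModelForSeq2SeqLM. The generation settings below (max_new_tokens, num_beams) are illustrative assumptions, not recommended values from this card.

import torch
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

model_id = "sharmax-vikas/flan-t5-base-samsum"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Shortened input, reusing the "dialogue:" prefix from the pipeline example above.
dialogue = """dialogue:
Margaret: Hi, in December I'd like to meet on 4th and 11th around 10:00 or 11:00.
Evans: Hi, 4th - we can meet at 10:00."""

inputs = tokenizer(dialogue, return_tensors="pt", truncation=True)
with torch.no_grad():
    summary_ids = model.generate(**inputs, max_new_tokens=60, num_beams=4)  # assumed settings
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))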


  • Model size: 248M params (Safetensors)
  • Tensor type: F32
