---
base_model: google/long-t5-tglobal-base
---
# Fine-tuned LongT5 for Conversational QA

This model is a fine-tuned version of [long-t5-tglobal-base](https://huggingface.co/google/long-t5-tglobal-base) for conversational question answering (QA). It was fine-tuned on the [SQuADv2](https://huggingface.co/datasets/squad_v2) and [CoQA](https://huggingface.co/datasets/coqa) datasets, as well as on Tryolabs' own custom dataset, [TryoCoQA](https://github.com/tryolabs/TryoCoQA).

An export of this model to the ONNX format is available at [tryolabs/long-t5-tglobal-base-blogpost-cqa-onnx](https://huggingface.co/tryolabs/long-t5-tglobal-base-blogpost-cqa-onnx).

You can find the details of how we fine-tuned the model and built TryoCoQA in our blog post!

You can also try the model out in the following [Space](https://huggingface.co/spaces/tryolabs/blogpost-cqa).
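To use the model locally, you can load it with the `transformers` library. The sketch below is a minimal example under two assumptions not stated in this card: the Hub repository name (inferred from the ONNX repo name above; verify it on the Hub) and the prompt format (a T5-style `question: ... context: ...` string, which may differ from the preprocessing actually used during fine-tuning).

```python
# Hedged sketch: both the repo name and the prompt layout below are
# assumptions, not documented facts from this model card.

def build_prompt(question: str, history: list, context: str) -> str:
    """Concatenate the current question, dialogue history, and passage
    into a single T5-style input string (assumed format)."""
    turns = " ".join(history)
    return f"question: {question} history: {turns} context: {context}"

def answer(question, history, context,
           model_name="tryolabs/long-t5-tglobal-base-blogpost-cqa"):
    # model_name is inferred from the ONNX export's repo name; check the Hub.
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_name)

    inputs = tokenizer(build_prompt(question, history, context),
                       return_tensors="pt")
    output_ids = model.generate(**inputs, max_new_tokens=64)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)
```

LongT5's transient-global attention makes it well suited to the long concatenated inputs this produces, since the full passage plus history can exceed the 512-token limit of a standard T5.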

## Results

* Fine-tuning for 3 epochs on SQuADv2 and CoQA combined achieved a 74.29 F1 score on the test set.
* Fine-tuning for 166 epochs on TryoCoQA achieved a 54.77 F1 score on the test set.