oluwatosin adewumi committed on
Commit
e87bb77
1 Parent(s): b4f4823

README updated again

Files changed (1)
  1. README.md +9 -7
README.md CHANGED
@@ -4,20 +4,21 @@ language:
  - en
  license: cc-by-4.0
  tags:
- - conversational
+ - text classification
  - transformers
  datasets:
- - multi_woz_v22
+ - PCL
  metrics:
- - perplexity
+ - F1
  widget:
  -
  ---

  ## T5Base-PCL
  This is a fine-tuned model of T5 (base) on the patronizing and condescending language (PCL) dataset by Pérez-Almendros et al. (2020), used for the Task 4 competition of SemEval-2022.
- It is intended to be used as a classification model for identifying PCL. The dataset it's trained on is limited in scope, as it covers
- only some news texts covering about 20 English-speaking countries.
+ It is intended to be used as a classification model for identifying PCL.
+
+ The dataset it is trained on is limited in scope, covering only news texts from about 20 English-speaking countries.
  The macro F1 score achieved on the test set, based on the official evaluation, is 0.5452.
  More information about the original pre-trained model can be found [here](https://huggingface.co/t5-base).
@@ -26,5 +27,6 @@ More information about the original pre-trained model can be found [here](https:
  ```python
  from transformers import T5ForConditionalGeneration, T5Tokenizer
  import torch
- tokenizer = AutoTokenizer.from_pretrained("tosin/pcl_22")
- model = AutoModelForCausalLM.from_pretrained("tosin/pcl_22")
+ tokenizer = T5Tokenizer.from_pretrained("tosin/pcl_22")
+ model = T5ForConditionalGeneration.from_pretrained("tosin/pcl_22")
+ tokenizer.pad_token = tokenizer.eos_token
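The updated snippet loads the checkpoint but stops short of running inference. A minimal sketch of how the fine-tuned T5 could be queried, assuming the checkpoint generates a short label string when given the input text (the exact label vocabulary is not documented in this README, and `to_label`/`classify` are hypothetical helper names, not part of the model card):

```python
def to_label(generated_text: str) -> str:
    """Map the model's generated string to a human-readable label.

    Assumption: the fine-tuned checkpoint emits a short label token such
    as "0"/"1"; the README does not document the exact label vocabulary.
    """
    g = generated_text.strip().lower()
    return "PCL" if g in {"1", "pcl", "true", "yes"} else "not PCL"


def classify(text: str, model, tokenizer) -> str:
    """Run seq2seq generation on one input and map the output to a label."""
    inputs = tokenizer(text, return_tensors="pt")
    # generate() handles the decoder loop; max_new_tokens keeps output short
    output_ids = model.generate(**inputs, max_new_tokens=4)
    return to_label(tokenizer.decode(output_ids[0], skip_special_tokens=True))


if __name__ == "__main__":
    # Heavyweight imports and checkpoint download are kept under the guard
    # so the helpers above can be reused without pulling the model.
    from transformers import T5ForConditionalGeneration, T5Tokenizer

    tokenizer = T5Tokenizer.from_pretrained("tosin/pcl_22")
    model = T5ForConditionalGeneration.from_pretrained("tosin/pcl_22")
    print(classify("These poor people cannot fend for themselves.", model, tokenizer))
```

Generation-based labeling is the usual pattern for T5-style classifiers, since T5 frames every task as text-to-text rather than exposing a classification head.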