Rallio67 committed
Commit c632af9
1 Parent(s): 52d630a

Update README.md

Files changed (1)
  1. README.md +11 -2
README.md CHANGED
@@ -1,10 +1,15 @@
## Overview
This is a LoRA adapter for google/flan-ul2 available on huggingface. It takes as input a text document and outputs a synopsis and document classifier tags.

+ You can use this to convert your training data into conditional pretraining examples.
+
```
import torch
from peft import PeftModel, PeftConfig
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer
+ import math
+
+ device_id=0

# Load device map for FLAN_UL2
device_map = {
@@ -45,7 +50,7 @@ def generate_condlabels(input_text):
penalty_alpha=0.6,
top_k=5,
bos_token_id=0,
- eos_token_id=2,
+ eos_token_id=1,
repetition_penalty=1.0,
return_dict_in_generate=True,
output_scores=True,
@@ -88,5 +93,9 @@ We have developed a fine tuned LoRA model based on the open source FLAN-UL2 that
"""

# Generate outputs for a list of strings
- generate_condlabels([text])
+ output=generate_condlabels([text])
+ print(output[0][0])
+
+ """<pad> Synopsis: The document outlines the conditional pretraining of large language models and provides information about the ChatGPT project. Tags: [ human language understanding, conditional pretraining, chatbots, machine learning]</s>"""
+
```
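
The substantive change in this commit is `eos_token_id` going from 2 to 1 in the `generate` call. For the T5-family tokenizer that google/flan-ul2 ships with, id 1 is typically the `</s>` end-of-sequence token and id 0 is `<pad>`, which matches the `<pad> ... </s>` sample output added at the bottom of the README. A quick check, assuming only that the tokenizer loads as in the README:

```python
from transformers import AutoTokenizer

# Inspect the special-token ids of the base model named in the README.
tok = AutoTokenizer.from_pretrained("google/flan-ul2")

# Expected for T5-family tokenizers: eos "</s>" -> 1, pad "<pad>" -> 0;
# if so, generation should stop on id 1, as the updated README now sets.
print(tok.eos_token, tok.eos_token_id)
print(tok.pad_token, tok.pad_token_id)
```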
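
For anyone landing on this commit without the full README, here is a minimal sketch of how the snippet fits together end to end: load the google/flan-ul2 base model, attach the LoRA adapter with PEFT, and generate with the contrastive-search settings visible in the diff. The adapter id below is a placeholder for the repo this README belongs to, and `max_new_tokens` plus the sample input are illustrative rather than taken from the README:

```python
import torch
from peft import PeftModel
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

base_id = "google/flan-ul2"         # base model named in the README
adapter_id = "<this-adapter-repo>"  # placeholder: the LoRA adapter repo this README belongs to

tokenizer = AutoTokenizer.from_pretrained(base_id)
base_model = AutoModelForSeq2SeqLM.from_pretrained(
    base_id, torch_dtype=torch.bfloat16, device_map="auto"
)
model = PeftModel.from_pretrained(base_model, adapter_id)
model.eval()

text = "Conditional pretraining converts ordinary documents into labeled training examples ..."
inputs = tokenizer(text, return_tensors="pt").to(model.device)

# Contrastive search with the settings shown in the diff:
# penalty_alpha=0.6, top_k=5, bos_token_id=0, eos_token_id=1, repetition_penalty=1.0.
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=256,
        penalty_alpha=0.6,
        top_k=5,
        bos_token_id=0,
        eos_token_id=1,
        repetition_penalty=1.0,
    )

# The adapter answers with a synopsis and classifier tags ("Synopsis: ... Tags: [ ... ]").
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```

The README's own script wraps this in a `generate_condlabels` helper and also passes `return_dict_in_generate=True` and `output_scores=True`; the sketch above keeps only the pieces visible in this diff.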