Update README.md
New pipeline usage
README.md
CHANGED
@@ -46,18 +46,20 @@ generate_text = pipeline(model="databricks/dolly-v2-3b", torch_dtype=torch.bfloa
 You can then use the pipeline to answer instructions:
 
 ```
-generate_text("Explain to me the difference between nuclear fission and fusion.")
+res = generate_text("Explain to me the difference between nuclear fission and fusion.")
+print(res[0]["generated_text"])
 ```
 
 Alternatively, if you prefer to not use `trust_remote_code=True` you can download [instruct_pipeline.py](https://huggingface.co/databricks/dolly-v2-3b/blob/main/instruct_pipeline.py),
 store it alongside your notebook, and construct the pipeline yourself from the loaded model and tokenizer:
 
 ```
+import torch
 from instruct_pipeline import InstructionTextGenerationPipeline
 from transformers import AutoModelForCausalLM, AutoTokenizer
 
 tokenizer = AutoTokenizer.from_pretrained("databricks/dolly-v2-3b", padding_side="left")
-model = AutoModelForCausalLM.from_pretrained("databricks/dolly-v2-3b", device_map="auto")
+model = AutoModelForCausalLM.from_pretrained("databricks/dolly-v2-3b", device_map="auto", torch_dtype=torch.bfloat16)
 
 generate_text = InstructionTextGenerationPipeline(model=model, tokenizer=tokenizer)
 ```
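For context, the `@@` hunk header above truncates the construction of `generate_text` from earlier in the README (it cuts off at `torch_dtype=torch.bfloa`). A minimal end-to-end sketch of the updated remote-code usage, assuming that line continues with `torch_dtype=torch.bfloat16`, `trust_remote_code=True`, and `device_map="auto"`:

```python
import torch
from transformers import pipeline

# Build the instruction-following pipeline from the model's custom pipeline code.
# trust_remote_code=True and device_map="auto" are assumptions; the hunk header
# only shows the construction line up to torch_dtype=torch.bfloa...
generate_text = pipeline(
    model="databricks/dolly-v2-3b",
    torch_dtype=torch.bfloat16,
    trust_remote_code=True,
    device_map="auto",
)

# Usage as of this change: capture the output list and print the text field.
res = generate_text("Explain to me the difference between nuclear fission and fusion.")
print(res[0]["generated_text"])
```

The same call pattern applies to the `InstructionTextGenerationPipeline` built in the second hunk, and the `torch_dtype=torch.bfloat16` added there presumably brings that path in line with the dtype the remote-code pipeline already uses.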