Svngoku committed on
Commit
848b002
1 Parent(s): c3aa0fd

Update README.md

Files changed (1):
  1. README.md +16 -18
README.md CHANGED
@@ -110,24 +110,22 @@ This is the model card of a 🤗 transformers model that has been pushed on the
  ## Uses
 
  ```py
- # Load the model
- model = LlamaForCausalLM.from_pretrained('/content/kongo-llama/checkpoint-9000')
-
- # Prepare input text
- text = "Nzambi "
- inputs = wrapped_tokenizer(text, return_tensors="pt")
-
- # Generate text
- generated_ids = model.generate(
-     max_length=150, # Increased length
-     num_beams=5, # Use beam search
-     temperature=0.7, # Adjust temperature for creativity
-     do_sample=True,
-     top_k=50, # Limit vocabulary for next token
-     top_p=0.95 # Nucleus sampling
+ # Use a pipeline as a high-level helper
+ from transformers import pipeline
+ pipe = pipeline("text-generation", model="Svngoku/kongo-llama")
+
+ pipe(
+     "Mbote, mono ",
+     max_length=150,
+     num_beams=5,
+     temperature=0.7,
+     do_sample=True,
+     top_p=0.95
  )
 
- # Decode and print the generated text
- generated_text = wrapped_tokenizer.batch_decode(generated_ids, skip_special_tokens=True)
- print(generated_text)
+ ```
+
+ ```sh
+ [{'generated_text': 'Mbote, mono na ngambu ya mpila ya bo ke monisa nde bantu yonso zole yina kaka na kati ya bo ke sadilaka yo mosi ve kana bo ke vandaka ti yo yina, to bima ya nkaka ya bo ke salaka sambu na bana ya zulu.'}]
+
  ```
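
Note: the new snippet relies on the high-level `pipeline` helper and no longer shows the explicit tokenizer/model handling. Below is a minimal low-level sketch (an assumption for illustration, not part of the commit) that loads the published `Svngoku/kongo-llama` checkpoint with `AutoTokenizer`/`AutoModelForCausalLM` and reuses the same generation settings; unlike the removed snippet, it also passes the tokenized inputs to `generate()`.

```py
# Minimal sketch (assumption): low-level equivalent of the new pipeline call.
from transformers import AutoModelForCausalLM, AutoTokenizer

# Load tokenizer and model from the Hub checkpoint referenced in the diff
tokenizer = AutoTokenizer.from_pretrained("Svngoku/kongo-llama")
model = AutoModelForCausalLM.from_pretrained("Svngoku/kongo-llama")

# Tokenize the prompt and pass the tensors to generate()
inputs = tokenizer("Mbote, mono ", return_tensors="pt")
generated_ids = model.generate(
    **inputs,          # the removed snippet omitted the inputs here
    max_length=150,
    num_beams=5,
    temperature=0.7,
    do_sample=True,
    top_p=0.95,
)

# Decode and print the generated text
print(tokenizer.batch_decode(generated_ids, skip_special_tokens=True))
```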