Afrizal Hasbi Azizy committed on
Commit
2b4cb5e
1 Parent(s): a39e3c8

Update README.md

Files changed (1):
  1. README.md +6 -7
README.md CHANGED
@@ -13,7 +13,7 @@ language:
 <center>
 <img src="https://imgur.com/9nG5J1T.png" alt="Kancil" width="600" height="300">
 <p><em>Kancil is a fine-tuned version of Llama 3 8B using a synthetic QA dataset generated with Llama 3 70B. Version zero of Kancil is the first generative Indonesian LLM to gain functional instruction performance using solely synthetic data.</em></p>
-<p><em><a href="https://colab.research.google.com/drive/1526QJYfk32X1CqYKX7IA_FFcIHLXbOkx?usp=sharing" style="color: blue;">Go straight to the colab demo</a></em></p>
+<p><em><a href="https://colab.research.google.com/drive/1526QJYfk32X1CqYKX7IA_FFcIHLXbOkx?usp=sharing" style="color: blue;">Go straight to the colab demo</a></em></p>
 </center>
 
 ### Introducing the Kancil family of open models
@@ -88,14 +88,13 @@ pass
 FastLanguageModel.for_inference(model)
 inputs = tokenizer(
 [
-    prompt_template.format(
-        prompt="Apa itu generative AI?",
-        response="",
-    )
+    prompt_template.format(
+        prompt="Bagaimana canting dan malam digunakan untuk menggambar pola batik?",
+        response="",)
 ], return_tensors = "pt").to("cuda")
 
-outputs = model.generate(**inputs, max_new_tokens = 128, temperature=.8, use_cache = True)
-print(tokenizer.batch_decode(outputs)[0])
+outputs = model.generate(**inputs, max_new_tokens = 600, temperature=.8, use_cache = True)
+print(tokenizer.batch_decode(outputs)[0].replace('\\n', '\n'))
 ```
 
 **Note:** There was an issue with the dataset such that newline characters are printed as string literals. Sorry about that!
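The dataset quirk described in the note means decoded model output can contain the two-character sequence backslash-`n` where a real newline belongs; the `.replace('\\n', '\n')` added in this commit undoes that after decoding. A minimal standalone sketch of that post-processing step (the sample string below is hypothetical, not real model output):

```python
def fix_newlines(decoded: str) -> str:
    """Convert literal '\\n' two-character sequences into actual newlines.

    Works around a dataset artifact where newline characters were stored
    as escaped string literals and are reproduced verbatim by the model.
    """
    return decoded.replace("\\n", "\n")


# Hypothetical decoded output containing escaped newlines:
raw = "Canting adalah alat tulis batik.\\nMalam adalah lilin khusus."
print(fix_newlines(raw))
```

This runs on the decoded string only, so it composes with any tokenizer: `fix_newlines(tokenizer.batch_decode(outputs)[0])` is equivalent to the inline `.replace` in the diff.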