Update README.md
README.md
About this version [V0.4]:
* We provide a system prompt [Files and Versions --> chat_template]. The SLM was partly trained with that template, so output quality is better if you use the prompt at the start.
* AURORA expects the Vicuna chat template [{{user}}: {some input}\nAURORA: {some output}\n{{user}}]. The model will only work correctly with this format.
* Recommended temperature is 0.3 to 0.5.
* Improved chat quality in general chat, roleplaying, etc.
* Math and other factual questions often produce false "facts", since the model was trained via in-context learning. It is able to understand some complex words and use them correctly.

All in one, AURORA's aim is to provide a digital friend that is also accessible to people with low-end devices.
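As a concrete illustration, the Vicuna-style format described above can be assembled with a small helper. This is only a sketch: the `build_prompt` function, the `User` speaker name, and the trailing open `AURORA:` tag (left dangling so the model completes the assistant turn) are assumptions, not part of this repository; the real system prompt ships in Files and Versions --> chat_template.

```python
# Sketch: render a conversation in the Vicuna-style template the README
# describes ("{{user}}: {some input}\nAURORA: {some output}\n{{user}}").
# build_prompt and the trailing "AURORA:" tag are illustrative assumptions.

def build_prompt(turns, user_name="User", system_prompt=""):
    """Render (role, text) turns, where role is "user" or "aurora"."""
    lines = [system_prompt] if system_prompt else []
    for role, text in turns:
        name = user_name if role == "user" else "AURORA"
        lines.append(f"{name}: {text}")
    lines.append("AURORA:")  # open tag so the model writes the next reply
    return "\n".join(lines)

print(build_prompt([("user", "Hello"), ("aurora", "Hi!"), ("user", "How are you?")]))
```

When feeding the resulting string to the model, sample with a temperature in the recommended 0.3 to 0.5 range.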