---
license: apache-2.0
---
# Lite-Mistral-150M-v2-Instruct

This is a Lite series model based on the Mistral architecture, comprising approximately 157 million parameters.
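
The parameter count is easy to verify directly; a minimal sketch using the same `transformers` API as the usage section below (the exact total can vary slightly depending on how tied embeddings are counted):

```python
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained("OuteAI/Lite-Mistral-150M-v2-Instruct")

# Sum the element counts of all parameter tensors
total = sum(p.numel() for p in model.parameters())
print(f"{total / 1e6:.1f}M parameters")  # expected to be roughly 157M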

<a href="https://huggingface.co/OuteAI/Lite-Mistral-150M-v2-Instruct">Lite-Mistral-150M-v2-Instruct</a> <br>
<a href="https://huggingface.co/OuteAI/Lite-Mistral-150M-v2-Instruct-GGUF">Lite-Mistral-150M-v2-Instruct-GGUF</a> <br>


## Improvements from v1:
- Extended Training: Further refinement of the model, resulting in improved benchmark performance and overall text generation quality.
- Tokenizer changes.

## How coherent is the 150M model?
Let's look at real-world examples:

### Example 1
```
Input:
<s>user
What is the capital of France?</s> 
<s>assistant
The capital of France is Paris.</s> 
<s>user
What is the capital of Spain?</s>

Response:
<s>assistant
The capital of Spain is Madrid, which is known for its beautiful architecture and rich history.</s>
```

### Example 2
```
Input:
<s>user
Who wrote the play "Romeo and Juliet"?</s>

Response:
<s>assistant
The play "Romeo and Juliet" was written by William Shakespeare.</s>
```

### Example 3
```
Input:
<s>user
Tell me a joke</s>

Response:
<s>assistant
Why did the chicken cross the playground? Because it wanted to see who could get the ball!</s>
```

### Example 4 (roleplay)
```
Input:
<s>user
Let's roleplay. I'll be a customer at a restaurant, and you be the waiter. Ready?</s> 
<s>assistant
Certainly! I'll play the role of a waiter. Please go ahead and start the roleplay when you're ready.</s> 
<s>user
Hello, I'd like to see the menu please.</s> 
<s>assistant
Good evening, sir! Welcome to our restaurant. Here's our menu for today. May I recommend our chef's special, the grilled salmon with lemon butter sauce? Let me know if you have any questions about our dishes.</s> 
<s>user
I'll have the salmon. What wines do you recommend to go with it?</s> 

Response:
<s>assistant
To go with your salmon, we suggest a pair of red-tailed or white-tailed jackfights. The grilled salmon is creamy and has a nice acidity. The lemon butter sauce is a great addition to this dish.</s>
```

The model handles straightforward tasks well, but it can lack depth on complex topics and in sophisticated creative work, as Example 4 shows.
It is optimized for scenarios where computational efficiency is paramount, and as a lightweight model it will generally show reduced accuracy and coherence compared to larger language models.

## Benchmarks:

All scores are accuracy (%).

<table style="text-align: left;">
  <tr>
    <th>Benchmark</th>
    <th>5-shot</th>
    <th>0-shot</th>
  </tr>
  <tr>
    <td>ARC Easy</td>
    <td>47.26</td>
    <td>45.58</td>
  </tr>
  <tr>
    <td>BoolQ</td>
    <td>43.33</td>
    <td>44.16</td>
  </tr>
  <tr>
    <td>HellaSWAG</td>
    <td>28.70</td>
    <td>28.72</td>
  </tr>
  <tr>
    <td>MMLU</td>
    <td>26.09</td>
    <td>25.28</td>
  </tr>
  <tr>
    <td>OpenBookQA</td>
    <td>16.00</td>
    <td>18.20</td>
  </tr>
  <tr>
    <td>PIQA</td>
    <td>62.79</td>
    <td>62.02</td>
  </tr>
  <tr>
    <td>Winogrande</td>
    <td>51.30</td>
    <td>51.78</td>
  </tr>
</table>
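
The card does not state the evaluation setup. Scores like these are commonly produced with EleutherAI's lm-evaluation-harness, so the sketch below is an assumption about the procedure, not the authors' documented method:

```python
# Hypothetical reproduction sketch using EleutherAI's lm-evaluation-harness
# (pip install lm-eval); the exact harness version and settings behind the
# table above are not documented, so numbers may differ slightly.
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",
    model_args="pretrained=OuteAI/Lite-Mistral-150M-v2-Instruct",
    tasks=["arc_easy", "boolq", "hellaswag", "mmlu", "openbookqa", "piqa", "winogrande"],
    num_fewshot=5,  # set to 0 for the 0-shot column
)
print(results["results"])
```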

## Chat format

This model uses a specific chat format for optimal performance.
```
<s>system
[System message]</s>
<s>user
[Your question or message]</s> 
<s>assistant
[The model's response]</s>
```
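
For multi-turn conversations the same pattern repeats per turn, with the assistant turn left open so the model completes it. A minimal illustrative helper (not an official API of this model):

```python
# Illustrative helper: `turns` is a list of (role, content) pairs,
# where role is "system", "user", or "assistant".
def format_chat(turns):
    prompt = "".join(f"<s>{role}\n{content}</s> \n" for role, content in turns)
    return prompt + "<s>assistant\n"  # open the assistant turn for generation

prompt = format_chat([
    ("user", "What is the capital of France?"),
    ("assistant", "The capital of France is Paris."),
    ("user", "What is the capital of Spain?"),
])
```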

## Usage with Hugging Face transformers
The model can be used with Hugging Face's `transformers` library:
```python
from transformers import AutoModelForCausalLM, AutoTokenizer

model = AutoModelForCausalLM.from_pretrained("OuteAI/Lite-Mistral-150M-v2-Instruct")
tokenizer = AutoTokenizer.from_pretrained("OuteAI/Lite-Mistral-150M-v2-Instruct")

def generate_response(message):
    # Format the message in the chat format above and open the assistant
    # turn so the model generates a reply instead of continuing the question
    prompt = f"<s>user\n{message}</s> \n<s>assistant\n"
    input_ids = tokenizer.encode(prompt, return_tensors="pt")
    output = model.generate(input_ids, max_length=100, pad_token_id=tokenizer.eos_token_id)

    # Decode only the newly generated tokens, skipping the prompt
    generated_text = tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True)
    return generated_text

message = "What is the capital of Spain?"
response = generate_response(message)
print(response)
```
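
## Usage with llama.cpp (GGUF)
The GGUF build linked above can be run with llama.cpp-compatible tooling. A minimal sketch using `llama-cpp-python`; the quantization filename below is hypothetical, so substitute an actual file from the GGUF repository:

```python
from llama_cpp import Llama

# Hypothetical filename: pick a real quantization from the GGUF repository
llm = Llama(model_path="Lite-Mistral-150M-v2-Instruct-Q8_0.gguf", n_ctx=2048)

# The prompt follows the chat format described above
prompt = "<s>user\nWhat is the capital of Spain?</s> \n<s>assistant\n"
output = llm(prompt, max_tokens=100, stop=["</s>"])
print(output["choices"][0]["text"])
```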

## Risk Disclaimer

By using this model, you acknowledge that you understand and assume the risks associated with its use. You are solely responsible for ensuring compliance with all applicable laws and regulations. We disclaim any liability for problems arising from the use of this open-source model, including but not limited to direct, indirect, incidental, consequential, or punitive damages. We make no warranties, express or implied, regarding the model's performance, accuracy, or fitness for a particular purpose. Your use of this model is at your own risk, and you agree to hold harmless and indemnify us, our affiliates, and our contributors from any claims, damages, or expenses arising from your use of the model.