---
license: gpl-3.0
---

A Chinese instruction-tuned LLaMA (30B) LoRA adapter.

Dataset: the Alpaca instruction dataset translated into Chinese.

# Usage

Please follow [alpaca-lora](https://github.com/tloen/alpaca-lora) to set up the base project, then run:

```bash
# The base model must match the 30B LoRA adapter.
python generate.py \
    --load_8bit \
    --base_model 'decapoda-research/llama-30b-hf' \
    --lora_weights 'llmatics/alpaca-lora-cn-30b'
```
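Alternatively, the adapter can be loaded directly with `transformers` and `peft`, bypassing `generate.py`. The sketch below is a minimal example, assuming `transformers`, `peft`, and `bitsandbytes` are installed as in the alpaca-lora setup; the prompt text and generation parameters are illustrative, not prescribed by this repository.

```python
# Minimal sketch: apply the LoRA adapter to the base LLaMA model with peft.
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer
from peft import PeftModel

base = "decapoda-research/llama-30b-hf"   # assumed base checkpoint; must match the 30B adapter
lora = "llmatics/alpaca-lora-cn-30b"

tokenizer = LlamaTokenizer.from_pretrained(base)
model = LlamaForCausalLM.from_pretrained(
    base,
    load_in_8bit=True,          # mirrors the --load_8bit flag above
    torch_dtype=torch.float16,
    device_map="auto",
)
model = PeftModel.from_pretrained(model, lora, torch_dtype=torch.float16)
model.eval()

# Generate a response to a Chinese instruction using the Alpaca prompt template.
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n请介绍一下长城。\n\n### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```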