---
license: gpl-3.0
---
A Chinese instruction-tuned LoRA adapter for LLaMA (30B).

Dataset: a Chinese translation of the Alpaca instruction dataset.
# Usage
Please follow [alpaca-lora](https://github.com/tloen/alpaca-lora) to set up the base project, then run:
```bash
python generate.py \
    --load_8bit \
    --base_model 'decapoda-research/llama-30b-hf' \
    --lora_weights 'llmatics/alpaca-lora-cn-30b'
```
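If you prefer to load the adapter directly in your own code instead of using `generate.py`, the sketch below shows one way to do it with `transformers` and `peft`. It is a minimal illustration, not taken from this repository: the prompt template follows the standard Alpaca format used by alpaca-lora, and the generation parameters are placeholders you should tune yourself.

```python
# Minimal sketch: load the 8-bit LLaMA-30B base model and apply this LoRA adapter.
# Assumes transformers, peft, and bitsandbytes are installed and a GPU is available.
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

BASE_MODEL = "decapoda-research/llama-30b-hf"
LORA_WEIGHTS = "llmatics/alpaca-lora-cn-30b"

tokenizer = LlamaTokenizer.from_pretrained(BASE_MODEL)
model = LlamaForCausalLM.from_pretrained(
    BASE_MODEL,
    load_in_8bit=True,
    torch_dtype=torch.float16,
    device_map="auto",
)
# Attach the Chinese instruction-tuned LoRA weights on top of the base model.
model = PeftModel.from_pretrained(model, LORA_WEIGHTS)
model.eval()

# Standard Alpaca-style prompt template (assumed to match the training format).
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    "### Instruction:\n请介绍一下北京的三个著名景点。\n\n### Response:\n"
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```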