---
license: gpl-3.0
---

A Chinese instruction-tuned LLaMA (30B), provided as LoRA weights.

Dataset: a Chinese-translated version of the Alpaca instruction dataset.
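The exact record schema of the translated data is not documented here. As a rough sketch, Alpaca-style datasets typically keep `instruction`, `input`, and `output` fields and build prompts from them; the helper and example record below are hypothetical illustrations, not code from this repo:

```python
# Illustrative only: assumes the translated dataset keeps the standard
# Alpaca fields ("instruction", "input", "output").

def build_prompt(instruction: str, model_input: str = "") -> str:
    """Assemble an Alpaca-style prompt from one record (hypothetical helper)."""
    if model_input:
        return (
            "Below is an instruction that describes a task, paired with an input "
            "that provides further context. Write a response that appropriately "
            "completes the request.\n\n"
            f"### Instruction:\n{instruction}\n\n"
            f"### Input:\n{model_input}\n\n"
            "### Response:\n"
        )
    return (
        "Below is an instruction that describes a task. Write a response that "
        "appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
        "### Response:\n"
    )


# Hypothetical record; in the actual dataset the text would be in Chinese.
example_record = {"instruction": "用三句话介绍一下长城。", "input": "", "output": "..."}
print(build_prompt(example_record["instruction"], example_record["input"]))
```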

## Usage

Please check the Alpaca-LoRA base project for installation, then run:

```bash
# The base model must be the 30B LLaMA checkpoint so that it matches these LoRA weights.
python generate.py \
    --load_8bit \
    --base_model 'decapoda-research/llama-30b-hf' \
    --lora_weights 'llmatics/alpaca-lora-cn-30b'
```
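Alternatively, the weights can be loaded directly with `transformers` and `peft`. The snippet below is a minimal sketch, not code shipped with this repo: it assumes `decapoda-research/llama-30b-hf` as the base checkpoint, `bitsandbytes` installed for 8-bit loading, and enough GPU memory for a 30B model.

```python
# Minimal sketch: load the 30B base model in 8-bit and apply the LoRA adapter.
import torch
from peft import PeftModel
from transformers import LlamaForCausalLM, LlamaTokenizer

base_model = "decapoda-research/llama-30b-hf"     # assumed base checkpoint
lora_weights = "llmatics/alpaca-lora-cn-30b"

tokenizer = LlamaTokenizer.from_pretrained(base_model)
model = LlamaForCausalLM.from_pretrained(
    base_model,
    load_in_8bit=True,          # requires bitsandbytes
    torch_dtype=torch.float16,
    device_map="auto",
)
model = PeftModel.from_pretrained(model, lora_weights, torch_dtype=torch.float16)
model.eval()

prompt = (
    "Below is an instruction that describes a task. Write a response that "
    "appropriately completes the request.\n\n"
    "### Instruction:\n用三句话介绍一下长城。\n\n### Response:\n"
)
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=256, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```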