Any constraint on the chat template when applying instruction fine-tuning?

#7
by andreaKIM - opened

Hello, thanks for such a great base model.

Since this model is a base model rather than an instruction-tuned model, I assumed no specific chat template was used during pre-training.
However, in the config files, I found some special tokens and a chat template.
Do they matter when I apply SFT to this model?

Thank you for reading my question!

You can use any prompt template you'd like. If you want to use the same template that Meta used, there is code to generate prompts for both chat and text completion in https://github.com/meta-llama/llama3/blob/main/llama/tokenizer.py.
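For reference, the chat template in that file renders messages with `<|start_header_id|>`/`<|end_header_id|>` role headers and `<|eot_id|>` turn terminators. Below is a minimal sketch of that rendering in plain Python; the helper name `format_llama3_chat` is my own, and you should verify the exact string against the linked tokenizer.py (or against `tokenizer.apply_chat_template` in transformers) before using it for SFT.

```python
def format_llama3_chat(messages):
    """Render a list of {"role", "content"} dicts into a Llama-3-style
    prompt string. A sketch of the format in Meta's tokenizer.py; the
    function name and structure here are illustrative, not official."""
    prompt = "<|begin_of_text|>"
    for msg in messages:
        # Each turn: role header, blank line, content, end-of-turn token.
        prompt += f"<|start_header_id|>{msg['role']}<|end_header_id|>\n\n"
        prompt += msg["content"].strip() + "<|eot_id|>"
    # Leave the assistant header open so the model generates the reply.
    prompt += "<|start_header_id|>assistant<|end_header_id|>\n\n"
    return prompt

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Hello!"},
]
print(format_llama3_chat(messages))
```

If you train with a different template, that is fine too; the base model has no baked-in preference. Just be consistent between SFT and inference, and make sure the special tokens you use are in the tokenizer's vocabulary.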