Issue with chat template in Saul LLM

#9
by Damith - opened

Hi,

I was trying to run inference with this model, and it seems to work only with the Llama 2 Jinja template, where the results are comparatively poor. When I tried the Mistral template, the model went horribly wrong, and when I remove the chat template completely, it fails with:

"jinja2.exceptions.TemplateError: Conversation roles must alternate user/assistant/user/assistant/..."

It seems I cannot pass a system prompt in this case. My prompt order is just a system message followed by a user message, that's it.
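For what it's worth, this error usually comes from Llama 2 / Mistral-style templates that expect strictly alternating user/assistant turns and reject a leading `system` role. A common workaround (a sketch, not anything specific to Saul) is to fold the system prompt into the first user message before applying the chat template:

```python
def merge_system_into_user(messages):
    """Fold a leading system message into the first user message so the
    remaining turns strictly alternate user/assistant, which Llama 2 /
    Mistral-style chat templates require."""
    if messages and messages[0]["role"] == "system":
        system, first_user, rest = messages[0], messages[1], messages[2:]
        merged = {
            "role": "user",
            "content": system["content"] + "\n\n" + first_user["content"],
        }
        return [merged] + rest
    return messages

# Example: a system prompt followed by a single user turn.
messages = [
    {"role": "system", "content": "You are a helpful legal assistant."},
    {"role": "user", "content": "Summarise this clause for me."},
]
print(merge_system_into_user(messages))
```

The merged list can then be passed to `tokenizer.apply_chat_template(...)` as usual, and the "roles must alternate" error should go away since the conversation now starts with a user turn.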

Following is my code :

[code screenshot]

//some function

[code screenshot]

Can you suggest any settings to get this running better?
