Any plans to release a quantized model?

#15
by asusevski - opened

I'd love to use a quantized version for local stuff, any chance we could get this released?

You can try out this INT8 OpenVINO version here: https://huggingface.co/sandeshb/llama-3-sqlcoder-8b-int8-ov
Fair warning: I've played around with this quantized model for a couple of days, and there are some quality issues, particularly with negative answers (where the answer should be "I don't know"). The model always returns an SQL query, even when no valid query exists for the question.

Using the default example on https://defog.ai/sqlcoder-demo: if I ask "What is the mass of the sun", the full model returns "SELECT 'I do not know' AS answer;", whereas the quantized version returns a syntactically valid query that is nonetheless wrong for the question, e.g. "SELECT p.name, p.quantity FROM products p WHERE P.NAME = 'SUN'". I think the way to handle this is to just run the query and, when the output is null/zero, handle it accordingly. Otherwise, the quantized model works great.
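To sketch the run-and-check fallback described above: the helper below is hypothetical (the function name, the in-memory `products` table, and the empty/null heuristic are all my own assumptions for illustration), using Python's stdlib `sqlite3` as a stand-in database. It executes the model's query and maps errors or empty/all-null results to "I do not know".

```python
import sqlite3

def answer_or_unknown(conn, sql):
    """Run model-generated SQL; treat failures or empty/null
    results as "I do not know" (hypothetical helper)."""
    try:
        rows = conn.execute(sql).fetchall()
    except sqlite3.Error:
        # Query didn't even parse or referenced missing tables/columns.
        return "I do not know"
    # Heuristic: no rows, or every value null/zero, counts as "no answer".
    if not rows or all(all(v in (None, 0) for v in row) for row in rows):
        return "I do not know"
    return rows

# Toy schema loosely resembling the demo's products table (assumed).
conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, quantity INTEGER)")
conn.execute("INSERT INTO products VALUES ('widget', 5)")

# The quantized model's hallucinated query matches nothing, so it
# falls back to the "I do not know" answer.
hallucinated = "SELECT p.name, p.quantity FROM products p WHERE p.name = 'SUN'"
print(answer_or_unknown(conn, hallucinated))
```

The exact emptiness heuristic would need tuning per schema (a legitimate `COUNT(*)` of zero is a real answer, for instance), but it captures the idea.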

Oh, thanks for pointing me in the right direction! At least verifying a bad SQL query can be fast, depending on the size of the database, so this might work well :)
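On the "verifying can be fast" point: one cheap option, sketched below with stdlib `sqlite3` as an assumed backend (the function name is mine), is to prefix the query with `EXPLAIN`, which makes SQLite compile and plan the statement without running it, so syntax errors and missing tables/columns surface without paying the cost of executing against a large table.

```python
import sqlite3

def is_valid_sql(conn, sql):
    """Check that a query parses and references real tables/columns,
    without executing it, via SQLite's EXPLAIN (hypothetical helper)."""
    try:
        conn.execute("EXPLAIN " + sql)
        return True
    except sqlite3.Error:
        return False

conn = sqlite3.connect(":memory:")
conn.execute("CREATE TABLE products (name TEXT, quantity INTEGER)")

print(is_valid_sql(conn, "SELECT name FROM products"))      # well-formed
print(is_valid_sql(conn, "SELECT nope FROM products"))      # unknown column
```

This only catches structurally invalid queries; a well-formed but semantically wrong query (like the `'SUN'` example above) still needs the run-and-inspect check.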

asusevski changed discussion status to closed