maximum context length

#3
by MaziyarPanahi - opened

Thanks for sharing this model with the community!

What is the maximum input length? If it's based on Llama-2, it should be 4096 unless some scaling happened during fine-tuning. Would you mind sharing the max context length this model can support? (Looking at the config, it seems to be 8192 or more.)

Many thanks

The maximum context length is 32K.
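For reference, a minimal sketch of how one might verify this from the model's config with Hugging Face transformers; the repo id below is a placeholder, and `max_position_embeddings` is the field that typically reflects the configured context window in Llama-style configs:

```python
# Sketch: read the configured context window from a model repo.
# "your-org/your-model" is a hypothetical placeholder, not this model's actual repo id.
from transformers import AutoConfig

config = AutoConfig.from_pretrained("your-org/your-model")
print(config.max_position_embeddings)  # e.g. 32768 for a 32K context window
```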

many thanks!

MaziyarPanahi changed discussion status to closed
