context
#1 — opened by UNITYA
What context size does this model have?
This is a regular Llama-3 model.
Native: 8K
KoboldCpp (KCPP) automatic RoPE scaling: up to 32K without much issue.
For now, you need to wait for the official KCPP 1.64 release or use this fork:
https://github.com/Nexesenex/kobold.cpp/releases
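A minimal launch sketch of the RoPE scaling described above (the model filename is a placeholder, and the exact flag behavior follows KoboldCpp's CLI, where requesting a context larger than the model's native training length triggers automatic RoPE adjustment):

```shell
# Request a 32K context; KoboldCpp auto-adjusts RoPE scaling
# when this exceeds the model's native 8K training length.
python koboldcpp.py --model model.Q4_K_M.gguf --contextsize 32768

# Alternatively, pin RoPE parameters manually with
# --ropeconfig [scale] [base] instead of relying on auto-scaling:
# python koboldcpp.py --model model.Q4_K_M.gguf --contextsize 32768 --ropeconfig 0.25 10000
```

Automatic scaling is usually the safer choice, since the correct scale factor depends on the ratio between requested and native context.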
Lewdiculous changed discussion status to closed.