---
license: bigcode-openrail-m
pipeline_tag: text-generation
library_name: gguf
---

GGUF quants for https://huggingface.co/bigcode/starcoder2-15b

> StarCoder2-15B model is a 15B parameter model trained on 600+ programming languages from The Stack v2, with opt-out requests excluded. The model uses Grouped Query Attention, a context window of 16,384 tokens with a sliding window attention of 4,096 tokens, and was trained using the Fill-in-the-Middle objective on 4+ trillion tokens.

| Layers | Context | Template |
| --- | --- | --- |
| 40 | 16384 | {context}<br><br>Code Editing Instruction: {prompt}<br>{response} |
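The template in the table can be assembled programmatically before handing the prompt to a GGUF runtime. A minimal sketch in Python; the placeholder values (the snippet being edited and the instruction) are illustrative, not part of the model card:

```python
# Build a prompt from the card's template. {response} is left empty so the
# model generates the edited code. All filler values below are examples only.
template = "{context}\n\nCode Editing Instruction: {prompt}\n{response}"

prompt = template.format(
    context="def add(a, b):\n    return a - b",  # code to edit (example)
    prompt="Fix the bug so the function adds its arguments.",
    response="",  # left blank for the model to complete
)
print(prompt)
```

The resulting string is what you would pass as the prompt to your inference runtime of choice.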