---
license: bigcode-openrail-m
datasets:
- bigcode/the-stack-v2-train-full-ids
tags:
- starcoder2
---
StarCoder2-15B is a 15B-parameter model trained on 600+ programming languages from The Stack v2, with opt-out requests excluded. The model uses Grouped Query Attention, a context window of 16,384 tokens with sliding-window attention of 4,096 tokens, and was trained using the Fill-in-the-Middle objective on 4+ trillion tokens.
The model was trained with NVIDIA NeMo™ Framework using the NVIDIA Eos Supercomputer built with NVIDIA DGX H100 systems.
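
For illustration, Fill-in-the-Middle inference wraps the code surrounding the gap in special sentinel tokens. The sketch below assumes the StarCoder-family tokens `<fim_prefix>`, `<fim_suffix>`, and `<fim_middle>`; check the tokenizer vocabulary to confirm the exact tokens for this checkpoint.

```python
# Sketch of a Fill-in-the-Middle prompt, assuming StarCoder-style FIM tokens.
# The model is expected to generate the code that belongs between the prefix
# and the suffix.
prefix = "def fibonacci(n):\n    "
suffix = "\n    return result"
fim_prompt = f"<fim_prefix>{prefix}<fim_suffix>{suffix}<fim_middle>"
print(fim_prompt)
```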

- **Project Website:** [bigcode-project.org](https://www.bigcode-project.org)
- **Paper:** Link
- **Point of Contact:** [contact@bigcode-project.org](mailto:contact@bigcode-project.org)
- **Languages:** 600+ Programming languages
## Use

### Intended use
The model was trained on GitHub code as well as additional selected data sources such as Arxiv and Wikipedia. As such, it is not an instruction-tuned model, and commands like "Write a function that computes the square root." do not work well.

### Generation
Here are some examples to get started with the model. You can find a script for fine-tuning in StarCoder2's GitHub repository.
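
For example, a minimal completion-style generation with the `transformers` library; this is a sketch that assumes the full-precision `bigcode/starcoder2-15b` checkpoint from the Hub and an available CUDA device:

```python
# pip install transformers
from transformers import AutoModelForCausalLM, AutoTokenizer

checkpoint = "bigcode/starcoder2-15b"  # full-precision checkpoint on the Hub
device = "cuda"  # assumes a GPU is available; use "cpu" otherwise

tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForCausalLM.from_pretrained(checkpoint).to(device)

# Completion-style prompting: give the model the start of the code.
inputs = tokenizer.encode("def print_hello_world():", return_tensors="pt").to(device)
outputs = model.generate(inputs, max_new_tokens=48)
print(tokenizer.decode(outputs[0]))
```

For the GGUF weights in this repository, a `llama-cpp-python` sketch; the model path is a placeholder, so substitute the actual `.gguf` filename you downloaded:

```python
# pip install llama-cpp-python
from llama_cpp import Llama

# model_path is a placeholder; point it at the .gguf file from this repo.
llm = Llama(model_path="./starcoder2-15b.gguf", n_ctx=4096)

output = llm("def print_hello_world():", max_tokens=48)
print(output["choices"][0]["text"])
```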