StarCoder2 is the next generation of transparently trained open code LLMs and comes in three sizes: 3B, 7B, and 15B parameters.
3b
7b
15b
548.7K Pulls Updated 3 months ago
50bffebc60cd · 11GB
model
arch starcoder2 · parameters 16B · quantization Q5_K_S · 11GB
params
{
"stop": [
"###",
"<|endoftext|>",
"<file_sep>"
]
}
66B
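As a sketch, the stop sequences above could be overridden in a custom Ollama Modelfile; the base tag and values here simply restate what this page shows and are otherwise illustrative:

```
FROM starcoder2:15b
PARAMETER stop "###"
PARAMETER stop "<|endoftext|>"
PARAMETER stop "<file_sep>"
```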
system
You are an exceptionally intelligent coding assistant that consistently delivers accurate and reliab
136B
template
{{ if .System }}{{ .System }}
{{ end }}{{ if .Prompt }}### Instruction
{{ .Prompt }}
{{ end }}###
141B
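A minimal Python sketch of how the Go template above assembles a prompt. Only the visible portion of the template is reproduced; the page truncates it after the final `###`, so whatever follows is left out here.

```python
# Mirrors the truncated Ollama template shown above:
# {{ if .System }}{{ .System }}\n{{ end }}{{ if .Prompt }}### Instruction\n{{ .Prompt }}\n{{ end }}###
def render_prompt(system=None, prompt=None):
    out = ""
    if system:
        out += system + "\n"
    if prompt:
        out += "### Instruction\n" + prompt + "\n"
    out += "###"  # the template is cut off on this page after these characters
    return out

print(render_prompt("You are a coding assistant.", "Write a hello world in C."))
```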
license
BigCode Open RAIL-M v1 License Agreement
Section I: Preamble
This OpenRAIL-M License Agreement was
13kB
Readme
Supporting a context window of up to 16,384 tokens, StarCoder2 is the next generation of transparently trained open code LLMs.
starcoder2:instruct: a 15B model that follows natural and human-written instructions
starcoder2:15b was trained on 600+ programming languages and 4+ trillion tokens.
starcoder2:7b was trained on 17 programming languages and 3.5+ trillion tokens.
starcoder2:3b was trained on 17 programming languages and 3+ trillion tokens.
StarCoder2-15B is the best in its size class and matches 33B+ models on many evaluations. StarCoder2-3B matches the performance of StarCoder1-15B.
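Any of the variants above can be pulled and run locally with the Ollama CLI (assuming Ollama is installed and the daemon is running); the tags match those listed on this page:

```shell
# Download the 7B variant, then run it with a one-off prompt
ollama pull starcoder2:7b
ollama run starcoder2:7b "Write a function that checks whether a number is prime."
```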