The TinyLlama project is an open endeavor to train a compact 1.1B Llama model on 3 trillion tokens.
1.1b
766.5K Pulls Updated 11 months ago
721a0fc5ae2d · 483MB
model
arch llama · parameters 1.1B · quantization Q2_K
483MB
system
You are a helpful AI assistant.
31B
template
<|system|>
{{ .System }}</s>
<|user|>
{{ .Prompt }}</s>
<|assistant|>
70B
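As a rough illustration (not part of the model files), the sketch below shows how this template combines a system message and a user prompt into the final prompt string; exact whitespace handling by Ollama's template engine may differ slightly.

```python
# Rough sketch of how the template above expands into a prompt string.
# Whitespace handling is an approximation of what the template engine produces.
def render_prompt(system: str, prompt: str) -> str:
    return (
        f"<|system|>\n{system}</s>\n"
        f"<|user|>\n{prompt}</s>\n"
        f"<|assistant|>\n"
    )

print(render_prompt("You are a helpful AI assistant.", "Why is the sky blue?"))
```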
params
{
"stop": [
"<|system|>",
"<|user|>",
"<|assistant|>",
"</s>"
98B
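These stop sequences tell the runtime where to cut off generation, and Ollama applies them by default for this model. As a minimal sketch, the same stop sequences could be passed explicitly through Ollama's local REST API (assuming a server running on the default localhost:11434); the override here is shown only for illustration.

```python
# Sketch: calling the model through Ollama's local REST API and passing the
# same stop sequences as in the params layer above (normally unnecessary,
# since they are already baked into the model's params).
import json
import urllib.request

payload = {
    "model": "tinyllama",
    "prompt": "Why is the sky blue?",
    "stream": False,
    "options": {"stop": ["<|system|>", "<|user|>", "<|assistant|>", "</s>"]},
}

req = urllib.request.Request(
    "http://localhost:11434/api/generate",
    data=json.dumps(payload).encode("utf-8"),
    headers={"Content-Type": "application/json"},
)
with urllib.request.urlopen(req) as resp:
    print(json.loads(resp.read())["response"])
```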
Readme
TinyLlama is a compact model with only 1.1B parameters. This compactness allows it to cater to a multitude of applications demanding a restricted computation and memory footprint.