Zephyr is a series of fine-tuned versions of the Mistral and Mixtral models that are trained to act as helpful assistants.
Tags: 7b, 141b
228.1K Pulls · Updated 8 months ago
d5c73d340d28 · 52GB
model
arch llama · parameters 141B · quantization Q2_K · 52GB
params · 98B
{
  "stop": [
    "<|system|>",
    "<|user|>",
    "<|assistant|>",
    "</s>"
  ]
}
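These stop sequences can also be overridden per request. A minimal sketch using Ollama's REST generate endpoint on a default local install (the prompt text is only a placeholder):

```python
import requests

# Minimal sketch: ask a local Ollama server to generate with zephyr,
# passing the same stop sequences listed in the params file above.
resp = requests.post(
    "http://localhost:11434/api/generate",
    json={
        "model": "zephyr",
        "prompt": "Explain Mixture of Experts in one sentence.",  # placeholder prompt
        "stream": False,
        "options": {"stop": ["<|system|>", "<|user|>", "<|assistant|>", "</s>"]},
    },
)
print(resp.json()["response"])
```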
template · 139B
{{ if .System }}<|system|>
{{ .System }}</s>
{{ end }}{{ if .Prompt }}<|user|>
{{ .Prompt }}</s>
{{ end }}<|assistant|>
{{ .Response }}</s>
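To illustrate what this template produces, here is a small sketch (not part of the model files) that assembles a prompt the same way; the system and user strings are placeholders:

```python
# Sketch of how the template above expands into a prompt string.
# The system and user messages are placeholders, not from the model card.
def build_prompt(system: str, prompt: str) -> str:
    text = ""
    if system:
        text += f"<|system|>\n{system}</s>\n"
    if prompt:
        text += f"<|user|>\n{prompt}</s>\n"
    text += "<|assistant|>\n"  # the model's response follows here, ending with </s>
    return text

print(build_prompt("You are a helpful assistant.", "What is Zephyr?"))
```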
license · 11kB
Apache License, Version 2.0, January 2004
Readme
Zephyr is a series of language models that are trained to act as helpful assistants. Zephyr 141B-A35B is the latest model in the series, and is a fine-tuned version of Mixtral 8x22b.
Sizes
- zephyr:141b: A Mixture of Experts (MoE) model with 141B total parameters and 35B active parameters.
- zephyr:7b: The original Zephyr model.
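Either tag can be queried through a local Ollama server once pulled. A minimal sketch against the chat endpoint (the message content is a placeholder):

```python
import requests

# Sketch: chat with either tag through a local Ollama server.
# Swap "zephyr:7b" for "zephyr:141b" to use the MoE variant.
resp = requests.post(
    "http://localhost:11434/api/chat",
    json={
        "model": "zephyr:7b",
        "messages": [{"role": "user", "content": "Introduce yourself in one sentence."}],
        "stream": False,
    },
)
print(resp.json()["message"]["content"])
```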