Stable LM 2 is a state-of-the-art language model available in 1.6B and 12B parameter sizes, trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.
1.6b
12b
101.2K Pulls · Updated 7 months ago
model · 24GB (e24492cdd1f0)
    arch stablelm · parameters 12.1B · quantization F32

license · 7.4kB
    STABILITY AI NON-COMMERCIAL RESEARCH COMMUNITY LICENSE AGREEMENT
    Dated: December 06, 2023
    By using …
Readme
Stable LM 2 is a state-of-the-art family of small language models, available in 1.6B and 12B parameter sizes, trained on multilingual data in English, Spanish, German, Italian, French, Portuguese, and Dutch.
The models are trained on a mix of publicly available and synthetic datasets, utilizing Direct Preference Optimization (DPO).
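Since this listing is an Ollama model page, a minimal sketch of querying the model through the Ollama Python client is shown below. It assumes a local Ollama server is running, the ollama package is installed, and that the weights are published under the tags from the list above (stablelm2 for 1.6B, stablelm2:12b for 12B); the prompt is purely illustrative.

# Minimal sketch: chatting with Stable LM 2 via the Ollama Python client.
# Assumes `pip install ollama`, a running local Ollama server, and that the
# model is available under the tag used below (an assumption from the tag list).
import ollama

response = ollama.chat(
    model="stablelm2",  # assumed tag for the 1.6B model; "stablelm2:12b" for the larger one
    messages=[
        {"role": "user", "content": "Give a one-sentence summary of Stable LM 2."},
    ],
)
print(response["message"]["content"])

Pulling the weights ahead of time (for example with ollama pull and the same tag) avoids the download happening on the first chat call.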