CodeQwen1.5 is a large language model pretrained on a large corpus of code data.
7b
120.9K Pulls Updated 6 months ago
18f779849b86 · 15GB

model: arch qwen2 · parameters 7.25B · quantization F16 · 15GB
license: Tongyi Qianwen LICENSE AGREEMENT · 6.9kB

Tongyi Qianwen Release Date: August 3, 2023
Readme
CodeQwen1.5 is based on Qwen1.5. It is trained on 3 trillion tokens of code data. Its major features include:
- Strong code generation capabilities and competitive performance across a series of benchmarks
- Support for long context understanding and generation with a maximum context length of 64K tokens
- Support for 92 coding languages
- Excellent performance in Text-to-SQL, bug fixing, and other coding use cases
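As a sketch of how these capabilities might be exercised locally, the snippet below builds a request body for Ollama's `/api/generate` endpoint and sends a Text-to-SQL prompt. The model tag `codeqwen` and the local endpoint URL are assumptions; check the tag shown on this page before running, and note that `num_ctx` can be raised toward the model's 64K-token context limit if your hardware allows.

```python
import json
import urllib.request

# Hypothetical helper: builds the JSON body for Ollama's /api/generate
# endpoint. The default model tag "codeqwen" is an assumption.
def build_generate_request(prompt, model="codeqwen", num_ctx=8192):
    return {
        "model": model,
        "prompt": prompt,
        "stream": False,              # return one complete response
        "options": {"num_ctx": num_ctx},  # context window, up to 64K tokens
    }

if __name__ == "__main__":
    # Text-to-SQL example prompt; requires a local Ollama server.
    body = build_generate_request(
        "Write a SQL query returning the top 5 customers by total order value."
    )
    req = urllib.request.Request(
        "http://localhost:11434/api/generate",
        data=json.dumps(body).encode("utf-8"),
        headers={"Content-Type": "application/json"},
    )
    with urllib.request.urlopen(req) as resp:
        print(json.loads(resp.read())["response"])
```

The same request shape works for the other use cases listed above (code generation, bug fixing) by swapping the prompt text.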