CodeQwen1.5 is the code-specific version of Qwen1.5. It is a transformer-based, decoder-only language model pretrained on a large amount of code data.
Strong code generation capabilities and competitive performance across a series of benchmarks;
Long-context understanding and generation, with a context length of 64K tokens;
Support for 92 programming languages;
Excellent performance in text-to-SQL, bug fixing, and related tasks.
3. Prerequisites
3.1. Base Environment
OS: CentOS 7
GPU: Tesla V100-SXM2-32GB
CUDA Version: 12.2
3.2. Download the Model
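As a minimal sketch, the model weights can be fetched with the ModelScope SDK (commonly used in mainland-China environments) or the Hugging Face Hub client. The repo id `qwen/CodeQwen1.5-7B-Chat` and the local directory `./models` are assumptions; substitute the model variant and path you actually need.

```shell
# Assumption: Python 3.8+ and pip are available on the CentOS 7 host.
pip install modelscope

# Download the full model snapshot (assumed repo id) into ./models.
python -c "from modelscope import snapshot_download; \
snapshot_download('qwen/CodeQwen1.5-7B-Chat', cache_dir='./models')"

# Alternative, if the host can reach Hugging Face:
#   pip install -U huggingface_hub
#   huggingface-cli download Qwen/CodeQwen1.5-7B-Chat --local-dir ./models
```

The checkpoint is tens of gigabytes, so make sure the target disk has enough free space before starting the download.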