# Long-context Programming
## DeepCoder-1.5B-Preview AWQ
License: MIT
DeepCoder-1.5B-Preview is a large language model for code reasoning, fine-tuned from DeepSeek-R1-Distill-Qwen-1.5B with distributed reinforcement learning to scale to long context lengths. This listing is an AWQ-quantized build.
Large language model · Transformers · English

Author: adriabama06
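
Since the entry describes an AWQ-quantized model served through Transformers, a minimal loading sketch follows. The repo id is an assumption pieced together from the author and model name above, not confirmed by the listing; loading AWQ weights additionally requires the `autoawq` package alongside `transformers` and `torch`.

```python
# Minimal sketch: load an AWQ-quantized causal LM with Hugging Face
# Transformers and run a short code-generation prompt.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "adriabama06/DeepCoder-1.5B-Preview-AWQ"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",  # place layers on available GPU(s)/CPU
)

prompt = "Write a Python function that reverses a singly linked list."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

The long-context claim in the description matters here: for prompts spanning whole files or repositories, the same call works, but memory use grows with sequence length, so quantized builds like this AWQ variant are a common fit for single-GPU setups.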
## Sombrero Opus 14B Sm5
License: Apache-2.0
Built on the Qwen 2.5 14B model architecture, with enhancements aimed at coding efficiency and computational reasoning.
Large language model · Transformers · Multilingual

Author: prithivMLmods
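
As a usage note for this entry, the sketch below shows chat-style generation through the Transformers `pipeline` API. The repo id is again an assumption based on the author and model name in the listing, and a 14B model generally needs GPU memory to match (`device_map="auto"` spreads weights across available devices).

```python
# Minimal sketch: chat-style text generation with the transformers
# pipeline API against the (assumed) model repo from this listing.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="prithivMLmods/Sombrero-Opus-14B-Sm5",  # assumed repo id
    device_map="auto",
)

messages = [
    {"role": "user", "content": "Explain the big-O cost of binary search, briefly."}
]
result = generator(messages, max_new_tokens=256)
# For chat input, generated_text is the message list with the
# assistant's reply appended as the last element.
print(result[0]["generated_text"][-1]["content"])
```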