# Efficient Code Generation
## Viper-Coder-v1.7-Vsm6
License: Apache-2.0 · Author: prithivMLmods
Viper-Coder-v1.7-Vsm6 is a large language model built on the Qwen2.5 14B model architecture. It focuses on improving coding efficiency and computational reasoning, optimizing memory usage, and reducing redundant text generation.
Tags: Large Language Model · Transformers · Supports Multiple Languages
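
Since the listing tags these models as Transformers-compatible, the following is a minimal sketch of loading and prompting one of them with the Hugging Face Transformers library. The repository id is an assumption inferred from the author and model name in the card; substitute the actual Hub id if it differs.

```python
# Minimal sketch: loading a Transformers-format code model for inference.
# The repository id below is assumed from the listing (author + model name);
# replace it with the actual Hugging Face Hub id if it differs.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "prithivMLmods/Viper-Coder-v1.7-Vsm6"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Qwen2.5-based instruct models typically accept a chat-style prompt.
messages = [{"role": "user", "content": "Write a Python function that reverses a string."}]
inputs = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

outputs = model.generate(inputs, max_new_tokens=256)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs.shape[-1]:], skip_special_tokens=True))
```

The other models in this listing can be loaded the same way by swapping in their respective repository ids.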
## Notbad v1.0 Mistral 24B
License: Apache-2.0 · Author: notbadai
Notbad v1.0 Mistral 24B is a model focused on mathematical and Python programming reasoning. It is based on Mistral-Small-24B-Instruct-2501 and further trained with reinforcement learning.
Tags: Large Language Model · Transformers
## Sombrero-Opus-14B-Sm5
License: Apache-2.0 · Author: prithivMLmods
Sombrero Opus 14B Sm5 is designed on the Qwen 2.5 14B model architecture, enhancing coding efficiency and computational reasoning capabilities.
Tags: Large Language Model · Transformers · Supports Multiple Languages