# Code Generation Enhancement

## Motif 2.6B
Motif-Technologies · License: Other · Large Language Model · Safetensors · Multilingual · 1,470 · 29

Motif 2.6B is a 2.6-billion-parameter language model trained from scratch on AMD Instinct™ MI250 GPUs, with the goal of building AI that is useful, reliable, and aligned with human values.
## Falcon H1 0.5B Base
tiiuae · License: Other · Large Language Model · Transformers · 485 · 10

Falcon-H1 is a decoder-only causal model developed by TII with a hybrid Transformer + Mamba architecture, delivering strong performance on English NLP tasks.
## MiMo 7B SFT
XiaomiMiMo · License: MIT · Large Language Model · Transformers · 1,183 · 23

MiMo-7B-RL is a reinforcement-learning model trained on top of MiMo-7B-SFT, achieving performance comparable to OpenAI o1-mini on mathematical and code-reasoning tasks.
## Qwen2.5 7B Fuse Exp
bunnycore · Large Language Model · Transformers · 22 · 2

A language model merged with the mergekit tool using the SCE method, combining several 7B-parameter-scale models.
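As a sketch of how such a merge might be configured: mergekit merges are driven by a YAML file naming the merge method, the source models, and method parameters. The model names and parameter values below are illustrative assumptions, not the actual recipe behind Qwen2.5 7B Fuse Exp.

```yaml
# Hypothetical mergekit configuration for an SCE merge of 7B models.
# Source models and select_topk value are assumptions for illustration.
merge_method: sce
base_model: Qwen/Qwen2.5-7B
models:
  - model: Qwen/Qwen2.5-7B-Instruct
  - model: Qwen/Qwen2.5-Coder-7B-Instruct
parameters:
  select_topk: 0.1   # retain only the highest-variance fraction of parameter deltas
dtype: bfloat16
```

A config like this would typically be run with mergekit's `mergekit-yaml` command, which writes the merged weights to an output directory.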
## Gemma 3 27B It Codeforces SFT
qgallouedec · Large Language Model · Transformers · 14 · 4

A fine-tuned version of google/gemma-3-27b-it on the open-r1/codeforces-cots dataset, intended primarily for code generation and programming-related tasks.
## Yi 1.5 34B Chat
01-ai · License: Apache-2.0 · Large Language Model · Transformers · 70.62k · 270

Yi-1.5 is an upgraded version of the Yi model that delivers stronger performance in coding, mathematics, reasoning, and instruction following, while retaining excellent language understanding, commonsense reasoning, and reading comprehension.
© 2025 AIbase