# 128K Long-Context Models
## Kimi K2 Instruct
**moonshotai** · License: Other · Task: Large Language Model · Framework: Transformers · 19.33k downloads · 878 likes

Kimi K2 is an advanced Mixture-of-Experts (MoE) language model with 32 billion active parameters out of 1 trillion total, optimized for agentic capabilities.
## Phi 4 Mini Reasoning
**microsoft** · License: MIT · Task: Large Language Model · Frameworks: Transformers, Multilingual · 18.93k downloads · 152 likes

Phi-4-mini-reasoning is a lightweight open-source model trained on high-quality, reasoning-dense data and further fine-tuned for more advanced mathematical reasoning capabilities.
## Phi 3.5 Vision Instruct
**FriendliAI** · License: MIT · Task: Image-to-Text · Frameworks: Transformers, Other · 370 downloads · 0 likes

Phi-3.5-vision is a lightweight, advanced open-source multimodal model that supports a 128K context length and focuses on processing high-quality, reasoning-rich text and visual data.