
# Mixture of Experts (MoE)

## DeepSeek V3 0324 GGUF UD

License: MIT · Author: unsloth · Downloads: 6,270 · Likes: 6
Tags: Large Language Model, English

Dynamically quantized GGUF builds of DeepSeek-V3-0324 provided by Unsloth, supporting inference frameworks such as llama.cpp and LM Studio. A minimal loading sketch follows.
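The sketch below shows one way such a GGUF quant might be loaded via llama-cpp-python, the Python bindings for llama.cpp. The file name, context size, and GPU-offload setting are illustrative assumptions, not values taken from the Unsloth release; a model of this size also requires very substantial RAM or VRAM.

```python
# Minimal sketch: loading a GGUF quant with llama-cpp-python.
# The model_path below is a hypothetical local file name; use whichever
# quant shard you actually downloaded from the Unsloth repository.
from llama_cpp import Llama

llm = Llama(
    model_path="DeepSeek-V3-0324-UD-Q2_K_XL.gguf",  # hypothetical file name
    n_ctx=4096,        # context window for this session
    n_gpu_layers=-1,   # offload all layers to GPU if memory allows; lower otherwise
)

out = llm("Explain mixture-of-experts routing in one paragraph.", max_tokens=200)
print(out["choices"][0]["text"])
```

LM Studio exposes the same GGUF files through a graphical interface, so no code is needed there; the bindings above are useful when the model has to run inside a script or service.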
## Jamba V0.1

License: Apache-2.0 · Author: ai21labs · Downloads: 6,247 · Likes: 1,181
Tags: Large Language Model, Transformers

Jamba is a state-of-the-art hybrid SSM-Transformer large language model that combines the Mamba architecture with Transformer attention, supports a 256K context length, and surpasses models of similar scale in throughput and performance. A minimal loading sketch follows.
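The sketch below assumes a recent Hugging Face transformers release with native Jamba support and enough GPU memory for the full checkpoint (roughly 52B total parameters, ~12B active per token); the prompt and generation settings are placeholders.

```python
# Minimal sketch: loading ai21labs/Jamba-v0.1 with Hugging Face transformers.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai21labs/Jamba-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to reduce memory use
    device_map="auto",           # spread layers across available GPUs
)

inputs = tokenizer("Mixture-of-experts layers route each token to", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because Jamba interleaves Mamba and attention layers with MoE blocks, it loads and generates through the standard causal-LM interface shown above; no custom inference loop is required.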