# Mixture of Experts (MoE)
## DeepSeek V3 0324 GGUF UD

A dynamically quantized (Unsloth Dynamic, "UD") GGUF build of DeepSeek-V3-0324 provided by Unsloth, for use with inference frameworks such as llama.cpp and LM Studio.

- License: MIT
- Tags: Large Language Model, English
- Publisher: unsloth
- Downloads: 6,270 · Likes: 6
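The card above names llama.cpp as a supported runtime. A minimal sketch using the llama-cpp-python bindings, which can fetch a GGUF quant directly from the hub: the repo id and quant filename pattern are assumptions about what Unsloth publishes, not details taken from the card, and a model of this size needs very large amounts of memory.

```python
# Minimal sketch, assuming llama-cpp-python is installed
# (pip install llama-cpp-python) and that the repo id and quant
# filename pattern below match what Unsloth actually publishes.
from llama_cpp import Llama

llm = Llama.from_pretrained(
    repo_id="unsloth/DeepSeek-V3-0324-GGUF",  # assumed Hugging Face repo id
    filename="*UD-Q2_K_XL*",                  # assumed dynamic-quant file pattern
    n_ctx=4096,                               # context window for this session
)

out = llm.create_chat_completion(
    messages=[{"role": "user", "content": "Briefly explain mixture-of-experts."}],
    max_tokens=128,
)
print(out["choices"][0]["message"]["content"])
```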
## Jamba v0.1

Jamba is a state-of-the-art hybrid SSM-Transformer large language model. It combines the strengths of the Mamba architecture with the Transformer, supports a 256K context length, and surpasses similarly sized models in throughput and performance.

- License: Apache-2.0
- Tags: Large Language Model, Transformers
- Publisher: ai21labs
- Downloads: 6,247 · Likes: 1,181
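Since the Jamba card lists Transformers as its framework tag, here is a minimal loading sketch with the Hugging Face transformers API. The model id matches the publisher above; the dtype and device settings are illustrative assumptions, and the full model requires substantial GPU memory.

```python
# Minimal sketch of loading Jamba v0.1 via Hugging Face Transformers.
# Hardware settings here are illustrative assumptions.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai21labs/Jamba-v0.1"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # assumes a GPU with bf16 support
    device_map="auto",           # spread weights across available devices
)

inputs = tokenizer(
    "A mixture-of-experts layer routes each token to", return_tensors="pt"
).to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```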