# 256K long context processing

Jamba V0.1
Apache-2.0
Jamba is a state-of-the-art hybrid SSM-Transformer large language model that combines the Mamba structured state-space model (SSM) architecture with Transformer layers. It supports a 256K-token context length and surpasses models of similar scale in throughput and performance.
Tags: Large Language Model, Transformers
ai21labs
6,247
1,181
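
As a minimal sketch of how such a model is typically used (not part of the listing above): assuming Jamba V0.1 is published on the Hugging Face Hub under the repo id `ai21labs/Jamba-v0.1` and that a Transformers version with Jamba support (4.40 or later) is installed, loading it and generating text might look like this. The repo id, version requirement, and prompt are assumptions for illustration.

```python
# Illustrative sketch. Assumptions: the model is hosted as "ai21labs/Jamba-v0.1"
# on the Hugging Face Hub and transformers >= 4.40 (which added Jamba support)
# is installed; device_map="auto" additionally requires the accelerate package.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "ai21labs/Jamba-v0.1"  # assumed Hub repo id

# Load the tokenizer and the hybrid SSM-Transformer model.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# Generate a short continuation; the same API accepts prompts up to the
# model's 256K-token context window, memory permitting.
inputs = tokenizer("Long-context summarization example:", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```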