AIbase

# 132 billion parameters

DBRX Base
A Mixture of Experts (MoE) large language model developed by Databricks, with 132 billion total parameters, of which 36 billion are active for any given input, supporting a 32K-token context window.
Tags: Large Language Model, Transformers
Developer: databricks
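The gap between total (132B) and active (36B) parameters comes from MoE routing: each token is dispatched to only a few experts, so most weights sit idle on any single forward pass. A minimal sketch of top-k softmax gating, the standard MoE routing scheme; the 16-expert / 4-active split matches DBRX's published design, but every function and shape here is illustrative, not Databricks' implementation:

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=4):
    """Route one token vector through the top-k experts of an MoE layer.

    Illustrative sketch: DBRX reports 16 experts with 4 active per token,
    which is why active parameters are far fewer than total parameters.
    """
    logits = x @ gate_w                      # router score per expert
    top = np.argsort(logits)[-k:]            # indices of the k best experts
    weights = np.exp(logits[top])
    weights /= weights.sum()                 # softmax over selected experts only
    # Only the k chosen experts run; the other 12 contribute no compute.
    return sum(w * experts[i](x) for w, i in zip(weights, top))

rng = np.random.default_rng(0)
d, n_experts = 8, 16
gate_w = rng.normal(size=(d, n_experts))
# Each "expert" is a toy linear layer standing in for a full FFN block.
experts = [(lambda W: (lambda x: x @ W))(rng.normal(size=(d, d)))
           for _ in range(n_experts)]
x = rng.normal(size=d)
y = moe_forward(x, gate_w, experts)
print(y.shape)  # (8,)
```

With 4 of 16 experts active, roughly a quarter of the expert weights participate per token, which is how a 132B-parameter model can run with the per-token cost of a much smaller dense one.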
© 2025 AIbase