# 132 billion parameters
## DBRX Base

A Mixture of Experts (MoE) large language model developed by Databricks, with 132 billion total parameters, of which 36 billion are active for any given token, supporting a 32K-token context window.

Tags: Large Language Model, Transformers
Developer: databricks
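
As a rough illustration of how a model like this might be loaded with the Hugging Face Transformers library (a minimal sketch, assuming the weights are published on the Hub under `databricks/dbrx-base` and that the environment has enough GPU memory; the MoE design activates only ~36B parameters per token, but all 132B must still be held in memory):

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Older transformers releases may additionally require
# trust_remote_code=True for this architecture.
tokenizer = AutoTokenizer.from_pretrained("databricks/dbrx-base")
model = AutoModelForCausalLM.from_pretrained(
    "databricks/dbrx-base",
    torch_dtype=torch.bfloat16,  # half precision to reduce memory footprint
    device_map="auto",           # shard the 132B parameters across available GPUs
)

inputs = tokenizer("Databricks is", return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```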