# Open-Source MoE
OpenMoE-Base
License: Apache-2.0
OpenMoE-Base is a Mixture of Experts (MoE) base model intended for debugging purposes, trained on only 128 billion tokens. As part of the OpenMoE project, it is released to help advance the open-source MoE community. A minimal loading sketch follows the model details below.
Tags: Large Language Model, Transformers
Developer: OrionZheng
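Since the model is tagged for the Transformers library, here is a minimal sketch of how it might be loaded. The repository id `OrionZheng/openmoe-base` is an assumption inferred from the developer handle on this page; check the Hugging Face Hub for the exact id.

```python
# Minimal sketch: loading OpenMoE-Base with the Transformers library.
# The repo id below is an assumption based on the developer handle shown
# on this page and may differ on the Hugging Face Hub.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "OrionZheng/openmoe-base"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

# Generate a short continuation; as a debugging-oriented base model trained
# on only 128B tokens, output quality is expected to be limited.
inputs = tokenizer("Mixture of Experts models route tokens to", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```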