AIbase

# Trillion-level parameters

## Olmo 2 0325 32B
License: Apache-2.0
OLMo 2 32B is the largest model in the OLMo 2 open language model series released by the Allen Institute for AI (AI2). It has 32 billion parameters, is open-sourced under the Apache 2.0 license, and supports English language processing.
Tags: Large Language Model · Transformers · English
Publisher: allenai