# 1-bit quantization

**Bitnet B1 58 3B** · 1bitLLM · MIT license

BitNet b1.58 is a 1.58-bit quantized large language model that achieves efficient inference by quantizing its weights to the ternary values {-1, 0, 1} (a quantization sketch follows below). The model reproduces the original paper's results and was trained on 100 billion tokens from the RedPajama dataset.

Categories: Large Language Model, Transformers
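The "1.58 bits" figure comes from the three possible weight values (log2 3 ≈ 1.58). The snippet below is a minimal NumPy sketch of absmean ternary quantization in the style described for BitNet b1.58: scale each weight matrix by its mean absolute value, then round and clip to {-1, 0, 1}. It is illustrative only, not code from the 1bitLLM repository; the function name and epsilon value are assumptions.

```python
import numpy as np

def absmean_ternary_quantize(w: np.ndarray, eps: float = 1e-5):
    """Quantize a weight matrix to ternary values {-1, 0, 1} (illustrative sketch).

    Scales the tensor by its mean absolute value, then rounds and clips,
    following the absmean scheme described for BitNet b1.58.
    """
    scale = np.mean(np.abs(w)) + eps                  # per-tensor absmean scale
    w_ternary = np.clip(np.round(w / scale), -1, 1)   # round-and-clip to {-1, 0, 1}
    return w_ternary.astype(np.int8), scale           # keep scale for dequantization

# Usage example: quantize a random weight matrix and reconstruct an approximation.
if __name__ == "__main__":
    rng = np.random.default_rng(0)
    w = rng.normal(scale=0.02, size=(4, 8)).astype(np.float32)
    w_q, s = absmean_ternary_quantize(w)
    w_approx = w_q.astype(np.float32) * s             # dequantized approximation of w
    print(w_q)
    print("max abs error:", np.abs(w - w_approx).max())
```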