
Bonsai

Developed by deepgrove
Bonsai is a small ternary-weighted language model with 500 million parameters. It is built on the Llama architecture, uses the Mistral tokenizer, and was trained on fewer than 5 billion tokens.
Downloads: 113
Release Date: 3/21/2025

Model Overview

Bonsai is a small ternary-weighted language model trained by deepgrove, primarily on the DCLM-Pro and Fineweb-Edu datasets, and its low training cost marks a notable step toward a new paradigm in model efficiency.

Model Features

Ternary Weight Design
The linear layers are modified to support ternary weights ({-1, 0, +1}), improving model efficiency; see the sketch after this list.
Efficient Training
Trained on fewer than 5 billion tokens, setting a new standard for efficiency.
Compact Model
Only 500 million parameters, ideal for resource-constrained environments.
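
This listing does not spell out Bonsai's exact quantization recipe, but a common way to implement a ternary-weighted linear layer is to keep full-precision latent weights and round them to {-1, 0, +1} with a statistics-based scale, using a straight-through estimator so gradients still reach the latent weights. The PyTorch sketch below is illustrative only; the class name TernaryLinear, the per-tensor mean-absolute-value scale, and the initialization are assumptions, not deepgrove's implementation.

```python
# Minimal sketch of a ternary-weighted linear layer (illustrative, not Bonsai's code).
import torch
import torch.nn as nn
import torch.nn.functional as F


class TernaryLinear(nn.Module):
    """Linear layer whose weights are quantized to {-1, 0, +1} in the forward pass."""

    def __init__(self, in_features: int, out_features: int, bias: bool = False):
        super().__init__()
        # Full-precision "latent" weights are kept for training.
        self.weight = nn.Parameter(torch.empty(out_features, in_features))
        nn.init.normal_(self.weight, std=0.02)
        self.bias = nn.Parameter(torch.zeros(out_features)) if bias else None

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        w = self.weight
        # Per-tensor scale: mean absolute value of the latent weights (an assumed choice).
        scale = w.abs().mean().clamp(min=1e-5)
        # Round the scaled weights to the nearest value in {-1, 0, +1}.
        w_ternary = (w / scale).round().clamp(-1, 1)
        # Straight-through estimator: use ternary weights in the forward pass,
        # but let gradients flow to the full-precision latent weights.
        w_quant = w + (w_ternary * scale - w).detach()
        return F.linear(x, w_quant, self.bias)
```

With this kind of design, training updates the full-precision weights, while inference only needs the ternary values plus one scale per tensor, which is where the memory and compute savings come from.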

Model Capabilities

Text Generation
Language Understanding

Use Cases

Education
Knowledge Q&A
Used to answer simple knowledge-based questions, such as 'What is the capital of France?' (see the generation sketch after this list).
Research
Model Efficiency Research
Used to study the efficiency and performance of ternary-weighted models.
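
As a rough illustration of the knowledge Q&A use case, the sketch below loads the model with the Hugging Face transformers library and generates a short answer. The repo id "deepgrove/Bonsai", the trust_remote_code flag, the prompt format, and the generation settings are assumptions for illustration and may not match the published checkpoint.

```python
# Minimal sketch of knowledge Q&A with Bonsai via transformers (settings are assumed).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "deepgrove/Bonsai"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
# trust_remote_code may be needed if the checkpoint ships custom ternary layers.
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Question: What is the capital of France?\nAnswer:"
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=20)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```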