
Biggie SmoLlm 0.15B Base

Developed by nisten
An upgraded version of the SmolLM-135M micro language model, scaled up to 0.18B parameters. It is well suited as a base for further training and offers excellent inference speed and coherence.
Downloads: 944
Release Date: 7/29/2024

Model Overview

This language model was built through a semi-automated continuous model-fusion process. It offers improved coherence and is an excellent starting point for further training. It incorporates several cutting-edge techniques, including evolutionary model merging, BitNet integration, and the experimental GrokAdamW optimizer.
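As a base model distributed through Hugging Face, it can be loaded with the standard transformers API. The following is a minimal sketch, assuming the repository id nisten/Biggie-SmoLlm-0.15B-Base (inferred from the model name above); adjust the id and generation settings to your setup.

```python
# Minimal sketch: loading the model with Hugging Face transformers.
# The repo id below is an assumption based on the model name, not confirmed by this page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "nisten/Biggie-SmoLlm-0.15B-Base"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "Explain why the sky appears blue."
inputs = tokenizer(prompt, return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=100, do_sample=True)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```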

Model Features

Efficient inference
Achieves roughly 160 tokens/second on a single CPU core, with no GPU required (see the CPU inference sketch after this list)
Advanced optimization techniques
Integrates evolutionary model merging, BitNet, and the experimental GrokAdamW optimizer
Lightweight
The quantized model is only 164MB, making it suitable for deployment in resource-constrained environments
Coherence performance
Stays coherent for at least the first 100 tokens at the default temperature setting
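For CPU-only deployment of the ~164MB quantized build, a GGUF quantization can be run with llama-cpp-python. The sketch below is a minimal example under that assumption; the local GGUF filename is hypothetical, and the single-thread setting mirrors the single-core speed figure above.

```python
# Minimal sketch of CPU-only inference with llama-cpp-python,
# assuming a quantized GGUF file has already been downloaded locally.
from llama_cpp import Llama

llm = Llama(
    model_path="biggie-smollm-0.15b-base.q8_0.gguf",  # hypothetical local path
    n_ctx=2048,      # context window
    n_threads=1,     # single CPU core, matching the speed figure above
    n_gpu_layers=0,  # no GPU offloading
)

out = llm("Explain what a neural network is.", max_tokens=100, temperature=0.8)
print(out["choices"][0]["text"])
```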

Model Capabilities

Text generation
Instruction understanding
Technical Q&A

Use Cases

Research applications
Scientific Q&A
Answering technical questions posed by NASA JPL scientists
Generates coherent responses appropriate to scientific contexts
Education
Teaching assistance
Generating explanations and examples of technical concepts