Tiny LLM
This is a miniature large language model with only 10 million parameters, possibly one of the smallest functional LLMs currently available.
Downloads: 101.14k
Release Date: 11/3/2024
Model Overview
Tiny-LLM is a lightweight large language model primarily designed for text generation tasks. It was trained on 32 billion tokens from the Fineweb dataset and supports a context length of 1024 tokens.
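The training scale is unusual for a model this small: 32 billion tokens over 10 million parameters works out to 3,200 tokens per parameter, far beyond the roughly 20 tokens per parameter suggested by compute-optimal scaling heuristics (Chinchilla). Heavy over-training like this is a common choice for small models intended for cheap inference. A quick back-of-the-envelope check:

```python
# Tokens-per-parameter ratio for Tiny-LLM's training run,
# using only the figures stated in the model card.
tokens = 32e9   # 32 billion training tokens (Fineweb)
params = 10e6   # 10 million parameters

ratio = tokens / params
print(f"{ratio:.0f} tokens per parameter")  # 3200
```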
Model Features
Lightweight
Only 10 million parameters, making it one of the smallest functional LLMs available
Efficient training
Trained on 32 billion tokens from the Fineweb dataset
Moderate context length
Supports a context length of 1024 tokens
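To see how a transformer fits into a 10-million-parameter budget, the sketch below tallies parameters for an illustrative decoder-only configuration. The hyperparameters (32k vocabulary, 192-dimensional hidden size, 8 layers, tied embeddings) are assumptions chosen to land near 10M; they are not Tiny-LLM's published configuration.

```python
# Rough parameter budget for an illustrative ~10M-parameter decoder-only
# transformer. All hyperparameters here are assumptions for illustration,
# NOT Tiny-LLM's actual configuration.
vocab_size, d_model, n_layers = 32_000, 192, 8

embedding = vocab_size * d_model              # token embeddings (tied with the output head)
per_layer = 4 * d_model**2 + 8 * d_model**2   # attention (Q, K, V, O) + 2-layer MLP at 4x width
total = embedding + n_layers * per_layer

print(f"~{total / 1e6:.1f}M parameters")
```

Note that at this scale the embedding table dominates: roughly 6M of the ~10M parameters sit in the vocabulary embeddings, which is why tiny models often tie input and output embeddings.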
Model Capabilities
Text generation
Use Cases
Text generation
Creative writing
Generate short stories or creative texts
Q&A systems
Answer simple questions or provide information
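All of these use cases run through the same autoregressive loop: the model repeatedly predicts the next token given at most the last 1024 tokens of context. A minimal, model-agnostic sketch of that loop (using a toy stand-in for the real model's next-token function):

```python
def generate(prompt_tokens, next_token_fn, max_context=1024,
             max_new_tokens=50, eos=-1):
    """Greedy autoregressive generation within a fixed context window.

    `next_token_fn` stands in for a real language model: it maps a token
    window to the next token id. The window is truncated to the last
    `max_context` tokens, mirroring Tiny-LLM's 1024-token limit.
    """
    tokens = list(prompt_tokens)
    for _ in range(max_new_tokens):
        window = tokens[-max_context:]   # keep only the most recent context
        nxt = next_token_fn(window)
        if nxt == eos:                   # stop on end-of-sequence
            break
        tokens.append(nxt)
    return tokens

# Toy "model": predicts previous token + 1, emits EOS after 5.
out = generate([0], lambda w: w[-1] + 1 if w[-1] < 5 else -1)
print(out)  # [0, 1, 2, 3, 4, 5]
```

In practice the toy `next_token_fn` would be replaced by a forward pass through the model plus a sampling or greedy-decoding step.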
© 2025 AIbase