TinyLlama 1.1B Intermediate Step 1195k Token 2.5T

Developed by TinyLlama
TinyLlama is a compact 1.1B-parameter Llama model being pretrained on 3 trillion tokens and designed for resource-constrained environments; this page covers the intermediate checkpoint saved at step 1195k (2.5 trillion tokens).
Downloads: 419
Release Time: 12/11/2023

Model Overview

The TinyLlama project aims to pretrain a 1.1B-parameter Llama model on 3 trillion tokens within 90 days using 16 A100-40G GPUs. This checkpoint is an intermediate release taken at step 1195k, after 2.5 trillion training tokens. The model adopts the same architecture and tokenizer as Llama 2, making it suitable for many Llama-based open-source projects.
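As a quick illustration, the sketch below loads this intermediate checkpoint with Hugging Face transformers and generates a short completion. The repo ID is inferred from the model title, and the prompt and sampling settings are illustrative assumptions, not values prescribed by the TinyLlama project.

```python
# Minimal sketch: load the intermediate checkpoint and generate text.
# The repo ID is inferred from the model title; sampling settings are
# illustrative assumptions, not project-recommended values.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-intermediate-step-1195k-token-2.5T"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, torch_dtype=torch.bfloat16)

prompt = "TinyLlama is a compact language model that"
inputs = tokenizer(prompt, return_tensors="pt")
output = model.generate(**inputs, max_new_tokens=64, do_sample=True, top_p=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```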

Model Features

Efficient Training
Targets pretraining on 3 trillion tokens within 90 days using only 16 A100-40G GPUs.
Compact Structure
With only 1.1B parameters, it is suitable for applications with limited computational and memory resources.
Compatibility
Adopts the same architecture and tokenizer as Llama 2, enabling plug-and-play use in Llama-based open-source projects.
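Because the architecture matches Llama 2, the checkpoint loads through the standard Llama classes in transformers. A minimal sketch, assuming the same repo ID as above:

```python
# Sketch of architecture compatibility: the checkpoint loads via the
# standard Llama classes, and its parameter count reflects the 1.1B size.
from transformers import LlamaForCausalLM, LlamaTokenizerFast

model_id = "TinyLlama/TinyLlama-1.1B-intermediate-step-1195k-token-2.5T"
tokenizer = LlamaTokenizerFast.from_pretrained(model_id)
model = LlamaForCausalLM.from_pretrained(model_id)

print(model.config.model_type)                     # "llama"
print(sum(p.numel() for p in model.parameters()))  # ~1.1e9 parameters
```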

Model Capabilities

Text Generation
Language Understanding

Use Cases

Natural Language Processing
Text Generation
Generate coherent English text
Language Understanding
Understand and answer English questions
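For the question-answering use case, a base (non-chat) checkpoint like this one is typically driven with a plain completion-style prompt. The "Question:/Answer:" format below is an assumption for illustration, not a template defined by the model:

```python
# Illustrative completion-style Q&A with the text-generation pipeline.
# The "Question:/Answer:" prompt format is an assumption; this base model
# has no chat template, so plain completion prompts work best.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-intermediate-step-1195k-token-2.5T",
)
out = generator("Question: What is the capital of France?\nAnswer:",
                max_new_tokens=20, do_sample=False)
print(out[0]["generated_text"])
```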