TinyLlama 1.1B Step 50K 105b

Developed by TinyLlama
TinyLlama is a 1.1B-parameter Llama model, planned to be pretrained on 3 trillion tokens; with training optimizations, pretraining is expected to complete in 90 days on 16 A100-40G GPUs.
Downloads 14.41k
Release Time: 9/1/2023

Model Overview

The TinyLlama project aims to pretrain a compact 1.1B-parameter Llama model that adopts the Llama 2 architecture and tokenizer, making it suitable for applications with limited computational and memory resources. As the checkpoint name indicates, this is an intermediate snapshot taken at training step 50K, after roughly 105B tokens.

Model Features

Efficient Training
Thanks to training optimizations, pretraining on 3 trillion tokens can be completed in 90 days on only 16 A100-40G GPUs.
Compatibility
Adopts exactly the same architecture and tokenizer as Llama 2, so it is compatible with most Llama-based open-source projects; a loading sketch follows this list.
Compactness
Contains only 1.1B parameters, suitable for applications with limited computational and memory resources.
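
Because the checkpoint shares the Llama 2 architecture and tokenizer, it can be loaded with the standard Hugging Face transformers Auto classes. A minimal sketch, assuming the repository id TinyLlama/TinyLlama-1.1B-step-50K-105b (inferred from this card's title):

```python
# Minimal loading sketch; the repo id is an assumption inferred from this card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-step-50K-105b"  # assumed Hugging Face repo id

# Because the architecture and tokenizer match Llama 2, the same
# generic Auto* classes used for Llama 2 checkpoints work here.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)
```

At 1.1B parameters, the full-precision (float32) weights occupy roughly 4.4 GB, which is what makes the model practical on a single consumer GPU or even CPU.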

Model Capabilities

Text Generation

Use Cases

Natural Language Processing
Text Generation
Generate coherent text content
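
A short generation sketch follows; the prompt and sampling parameters are illustrative assumptions, and since this is an early intermediate checkpoint (50K steps, ~105B tokens), outputs will be rougher than those of the fully trained model.

```python
# Text-generation sketch using the transformers pipeline API.
# The repo id, prompt, and sampling settings are illustrative assumptions.
from transformers import pipeline

generator = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-step-50K-105b",  # assumed repo id
)

result = generator(
    "The TinyLlama project is",
    max_new_tokens=64,   # cap on the number of generated tokens
    do_sample=True,      # sample instead of greedy decoding
    temperature=0.7,
    top_p=0.9,
)
print(result[0]["generated_text"])
```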