
TinyLlama 1.1B Intermediate Step 715k 1.5T

Developed by TinyLlama
TinyLlama is a Llama-architecture model with 1.1 billion parameters. The project pre-trains it toward a 3-trillion-token target; this checkpoint is an intermediate snapshot at step 715k (1.5 trillion tokens), suited to scenarios with limited compute and memory.
Downloads: 4,835
Release Time: 11/4/2023

Model Overview

The TinyLlama project aims to pre-train a 1.1-billion-parameter Llama model on 3 trillion tokens, making it suitable for applications where compute and memory are limited.

Model Features

Compact model
With only 1.1 billion parameters, it fits applications where compute and memory are constrained.
Compatibility
It adopts exactly the same architecture and tokenizer as Llama 2, so it can be dropped into many open-source projects built on Llama; a loading sketch follows this feature list.
Efficient training
Pre-training on 3 trillion tokens was completed in 90 days using 16 A100-40G GPUs.
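
Because the checkpoint is a drop-in Llama model, it loads with the standard Hugging Face transformers Auto classes. The minimal sketch below assumes the checkpoint is published under the repo id TinyLlama/TinyLlama-1.1B-intermediate-step-715k-1.5T, inferred from the model name rather than confirmed by this page:

# Minimal loading sketch; the repo id is an assumption inferred from the model name.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "TinyLlama/TinyLlama-1.1B-intermediate-step-715k-1.5T"

tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModelForCausalLM.from_pretrained(repo_id)

# Since the architecture and tokenizer match Llama 2, the resolved config is the
# standard Llama implementation, so tooling written for Llama 2 applies unchanged.
print(model.config.model_type)  # prints "llama"
print(f"{sum(p.numel() for p in model.parameters()) / 1e9:.1f}B parameters")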

Model Capabilities

Text generation
Language understanding

Use Cases

Natural language processing
Text generation
Generates coherent text content; outputs show strong coherence and logical flow. A brief generation sketch follows.
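
The sketch below reuses the tokenizer and model loaded in the snippet above; the sampling parameters are illustrative assumptions, not values taken from this page:

import torch

prompt = "The TinyLlama project aims to"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation; the parameters here are illustrative defaults.
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=64,   # cap the length of the continuation
        do_sample=True,      # sample rather than greedy-decode for varied text
        temperature=0.8,
        top_p=0.95,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))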