
TinyLlama 1.1B Chat v0.6

Developed by TinyLlama
TinyLlama is a 1.1 billion parameter Llama model pre-trained on 3 trillion tokens, suitable for scenarios with limited computation and memory resources.
Downloads: 11.60k
Release Time: 11/20/2023

Model Overview

TinyLlama is a compact large language model that fully replicates the architecture and tokenizer of Llama 2, making it compatible with many Llama-based open-source projects.
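
A minimal loading sketch, assuming a transformers installation and the Hugging Face repo id TinyLlama/TinyLlama-1.1B-Chat-v0.6: because the model reuses the Llama 2 architecture, the standard auto classes load it without any custom code.

```python
# Minimal sketch (assumes the repo id "TinyLlama/TinyLlama-1.1B-Chat-v0.6" and the
# transformers library). TinyLlama reuses the Llama 2 architecture, so the generic
# auto classes resolve to the same model class that Llama 2 itself uses.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TinyLlama/TinyLlama-1.1B-Chat-v0.6"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

print(type(model).__name__)  # -> LlamaForCausalLM, i.e. the Llama 2 model class
```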

Model Features

Efficient training
Pre-trained on 3 trillion tokens in just 90 days using only 16 A100-40G GPUs.
Compatibility
Fully replicates the Llama 2 architecture and tokenizer, compatible with Llama-based open-source projects.
Fine-tuning approach
Follows Hugging Face's Zephyr training recipe: supervised fine-tuning on the UltraChat dataset followed by DPO alignment (see the chat example below).
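
As a rough illustration of the resulting chat behavior, the sketch below formats Zephyr-style system/user turns with the tokenizer's built-in chat template; the repo id and generation settings are assumptions, and a transformers version with chat-template support (>= 4.34) is assumed.

```python
# Minimal chat sketch, assuming transformers >= 4.34 (for apply_chat_template) and
# the repo id "TinyLlama/TinyLlama-1.1B-Chat-v0.6". The chat template turns the
# message list into the Zephyr-style prompt the fine-tuned model expects.
from transformers import pipeline

chat = pipeline("text-generation", model="TinyLlama/TinyLlama-1.1B-Chat-v0.6")

messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "Explain what a tokenizer does in one sentence."},
]
prompt = chat.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
output = chat(prompt, max_new_tokens=128, do_sample=True, temperature=0.7)
print(output[0]["generated_text"])
```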

Model Capabilities

Text generation
Dialogue systems

Use Cases

Chatbots
Stylized chatbots
Chatbots can be customized with different personas, such as a pirate theme; the model then generates dialogue that matches the chosen style (see the sketch after this list).
Resource-constrained applications
Edge device deployment
Suitable for running on devices with limited computation and memory resources.
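
A combined sketch of the two use cases above: a pirate-styled system prompt plus half-precision weights to keep the memory footprint small on constrained hardware. The repo id, dtype choice, and generation settings are illustrative assumptions, not a prescribed deployment setup.

```python
# Sketch of a stylized (pirate-themed) chatbot with a reduced memory footprint.
# float16 halves the weight memory of the 1.1B model (~2.2 GB vs ~4.4 GB in float32);
# all settings here are examples, not required values.
import torch
from transformers import pipeline

chat = pipeline(
    "text-generation",
    model="TinyLlama/TinyLlama-1.1B-Chat-v0.6",
    torch_dtype=torch.float16,  # smaller footprint for constrained devices
)

messages = [
    {"role": "system",
     "content": "You are a friendly chatbot who always responds in the style of a pirate."},
    {"role": "user",
     "content": "How many helicopters can a human eat in one sitting?"},
]
prompt = chat.tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(chat(prompt, max_new_tokens=96)[0]["generated_text"])
```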