Tinyllama 1.1B Chat V0.4 GGUF

Developed by afrideva
TinyLlama-1.1B is a compact large language model with 1.1 billion parameters that follows the Llama 2 architecture and is optimized for compute- and memory-constrained scenarios.
Release Time : 11/16/2023

Model Overview

This is a chat model fine-tuned from TinyLlama-1.1B: the base model was pre-trained on 3 trillion tokens, then fine-tuned on the OpenAssistant dataset using the chatml prompt format.
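Because the model was fine-tuned with the chatml format, prompts must wrap each turn in chatml role markers. A minimal sketch of how such a prompt can be assembled (the helper name and message structure are illustrative, not from the model card):

```python
def build_chatml_prompt(messages):
    """Assemble a chatml prompt from (role, content) pairs.

    chatml wraps each turn in <|im_start|>role ... <|im_end|> markers;
    the trailing open assistant tag asks the model to continue.
    """
    parts = [f"<|im_start|>{role}\n{content}<|im_end|>" for role, content in messages]
    parts.append("<|im_start|>assistant")
    return "\n".join(parts)

prompt = build_chatml_prompt([
    ("system", "You are a helpful assistant."),
    ("user", "What is TinyLlama?"),
])
print(prompt)
```

The resulting string is what gets passed to a GGUF runtime such as llama.cpp; the runtime generates tokens until it emits the `<|im_end|>` stop marker.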

Model Features

Compact and efficient
A small model with only 1.1B parameters, suitable for resource-constrained environments
Fast training
Pre-trained on 3 trillion tokens in 90 days using 16 A100-40G GPUs
Strong compatibility
Uses the same architecture and tokenizer as Llama 2, making it plug-and-play compatible with projects in the Llama ecosystem
Chat-optimized
Fine-tuned with the OpenAssistant dataset, optimized for conversational scenarios

Model Capabilities

Text generation
Conversational interaction
English comprehension and generation

Use Cases

Chat applications
Smart assistant
Build lightweight conversational assistants
Generates coherent and contextually relevant dialogue responses
Education
Learning tutor
Helps students answer learning-related questions
Provides reasonable answers to education-related queries
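The assistant use cases above boil down to a loop: keep a conversation history, render it as a chatml prompt, and feed it to the model. A minimal sketch, where `echo_generate` is a stub standing in for an actual GGUF inference call (e.g. via llama.cpp); the function names and structure are assumptions for illustration, not part of the model card:

```python
def respond(history, user_message, generate):
    """One assistant turn: record the user message, render the full
    history as a chatml prompt, and call the supplied generate function."""
    history.append(("user", user_message))
    parts = [f"<|im_start|>{role}\n{content}<|im_end|>" for role, content in history]
    parts.append("<|im_start|>assistant")
    prompt = "\n".join(parts)
    reply = generate(prompt)  # in practice, a call into a GGUF runtime
    history.append(("assistant", reply))
    return reply

# Stub generator standing in for TinyLlama inference (assumption for the sketch).
def echo_generate(prompt):
    return "This is a placeholder reply."

history = [("system", "You are a helpful tutor.")]
reply = respond(history, "What is a parameter?", echo_generate)
print(reply)
```

Carrying the full history in every prompt is what lets a small model like this stay on topic across turns; with only 1.1B parameters, the context window, not compute, is usually the practical limit.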