Tinyllama 15M
A 15-million-parameter Llama 2 architecture model trained on the TinyStories dataset
Downloads 3,217
Release date: 9/16/2023
Model Overview
This is a lightweight language model based on the Llama 2 architecture, optimized for simple language tasks and short-story generation.
Model Features
Lightweight design
Only 15 million parameters, suitable for resource-constrained environments
Llama 2 architecture
Built on Meta's Llama 2 architecture, offering efficient inference even on modest hardware
Story generation optimization
Specifically trained on the TinyStories dataset, excelling at generating coherent short stories
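To make the "15 million parameters" figure concrete, the sketch below counts the weights of a simplified Llama-style transformer. The configuration used (dim=288, 6 layers, hidden_dim=768, vocab_size=32000, tied embeddings) is an assumption: it matches the widely used llama2.c "stories15M" setup, not anything stated on this card, but it shows how a Llama 2-style model lands at roughly 15M parameters.

```python
def llama_param_count(dim, n_layers, hidden_dim, vocab_size, tied_embeddings=True):
    """Approximate parameter count of a simplified Llama-style transformer."""
    embed = vocab_size * dim            # token embedding table
    attn = 4 * dim * dim                # wq, wk, wv, wo projections
    ffn = 3 * dim * hidden_dim          # w1, w2, w3 of the SwiGLU MLP
    norms = 2 * dim                     # two RMSNorm weight vectors per layer
    per_layer = attn + ffn + norms
    total = embed + n_layers * per_layer + dim  # + final RMSNorm
    if not tied_embeddings:
        total += vocab_size * dim       # separate output head
    return total

# Assumed config (llama2.c stories15M-style), not taken from this card:
total = llama_param_count(dim=288, n_layers=6, hidden_dim=768, vocab_size=32000)
print(f"{total / 1e6:.1f}M parameters")  # → 15.2M parameters
```

Note that the token-embedding table alone accounts for over half the weights at this scale, which is why tying the input embeddings to the output head is common in very small models.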
Model Capabilities
Text generation
Story creation
Short text continuation
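All three capabilities reduce to the same autoregressive loop: at each step the model's output logits are turned into a probability distribution and the next token is sampled from it. A minimal, self-contained sketch of that decoding step follows; the toy 5-token logits and the temperature value are illustrative stand-ins, not this model's actual outputs or defaults.

```python
import math
import random

def softmax(logits, temperature=1.0):
    """Convert raw logits into a probability distribution over tokens."""
    scaled = [l / temperature for l in logits]
    m = max(scaled)  # subtract the max for numerical stability
    exps = [math.exp(s - m) for s in scaled]
    total = sum(exps)
    return [e / total for e in exps]

def sample_next_token(logits, temperature=0.8, rng=random):
    """Draw one token id from the distribution induced by the logits."""
    probs = softmax(logits, temperature)
    return rng.choices(range(len(logits)), weights=probs, k=1)[0]

# Dummy logits over a toy 5-token vocabulary; a real forward pass would
# produce one logit per entry of the model's full (~32k-token) vocabulary.
logits = [2.0, 0.5, -1.0, 0.1, 1.2]
token = sample_next_token(logits, temperature=0.8)
```

Lower temperatures sharpen the distribution toward the highest-scoring token (more predictable stories); higher temperatures flatten it (more varied, but less coherent, continuations).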
Use Cases
Education
Children's story generation
Generating simple short stories for children's educational applications
Produces coherent short stories suitable for children's reading
Entertainment
Creative writing assistance
Providing inspiration and content continuation for creative writing
Helps writers overcome writer's block and provides creative inspiration