MicroLlama

Developed by keeeeenw
MicroLlama is a 300-million-parameter Llama model pretrained by individual developer keeeeenw within a $500 budget, focusing on English text generation tasks.
Downloads: 2,955
Release Date: 3/29/2024

Model Overview

This is a miniaturized Llama model designed to show that useful large language models can be trained with limited resources. The model is based on the TinyLlama project, with code-related data removed from the training corpus so the model focuses on general English text generation.
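As a rough illustration of how such a model would be used, the sketch below loads it with the Hugging Face `transformers` library and generates a short completion. The repository id `keeeeenw/MicroLlama` is an assumption inferred from the developer name; check the actual model page before use.

```python
# Minimal sketch: load MicroLlama via Hugging Face transformers and
# generate a short English completion.
# NOTE: the hub id "keeeeenw/MicroLlama" is an assumption, not confirmed
# by this page; requires network access to download the weights.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "keeeeenw/MicroLlama"  # assumed repository id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "The quick brown fox"
inputs = tokenizer(prompt, return_tensors="pt")
# Greedy decoding keeps the example deterministic; a 300M model is small
# enough to run this on CPU.
outputs = model.generate(**inputs, max_new_tokens=32, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

Because the model has only 300 million parameters, inference like this is feasible on a laptop CPU without quantization, which is the deployment scenario the model card emphasizes.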

Model Features

Low-cost training
Completed training within a $500 budget, demonstrating the feasibility of miniaturized LLMs
Fully open-source
Uses entirely open-source datasets and model architecture with no proprietary data dependencies
Lightweight
Only 300 million parameters, suitable for deployment in resource-limited environments

Model Capabilities

English text generation
Question answering systems
Language understanding

Use Cases

Education and research
Small LLM research
Serves as a case study for LLM performance in resource-constrained environments
Demonstrates that small models can reach useful performance on English language tasks despite a minimal training budget
Application development
Lightweight chatbot
Suitable for dialogue applications on mobile or edge devices