
MiniLLM 0.2B Base

Developed by Tongjilibo
MiniLLM is a lightweight language-model project that implements the full pipeline from pre-training → instruction fine-tuning → reward modeling → reinforcement learning, economically and efficiently building a chat model with basic conversational ability.
Downloads: 41
Release Date: 3/16/2024

Model Overview

This project is dedicated to building a lightweight language model on the bert4torch training framework, with concise and efficient code. The trained model can plug directly into the transformers inference ecosystem. The current experimental model offers only basic conversational functionality.
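Since the trained weights are advertised as compatible with the transformers inference ecosystem, loading them might look like the sketch below. The repo id, the `trust_remote_code` flag, and the prompt are assumptions for illustration, not details taken from the project:

```python
# Minimal sketch of loading MiniLLM through Hugging Face transformers.
# The repo id below is an assumption; the actual Hub id may differ.
from transformers import AutoTokenizer, AutoModelForCausalLM

MODEL_ID = "Tongjilibo/MiniLLM-0.2B-Base"  # hypothetical repo id

def generate(prompt: str, max_new_tokens: int = 64) -> str:
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID, trust_remote_code=True)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    # Decode only the tokens generated after the prompt
    return tokenizer.decode(
        outputs[0][inputs["input_ids"].shape[1]:],
        skip_special_tokens=True,
    )

if __name__ == "__main__":
    print(generate("你好"))  # a simple Chinese greeting as the prompt
```

Given the model's scale (0.2B parameters, basic chat only), expect short, simple responses rather than detailed reasoning.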

Model Features

Lightweight and efficient: built on the bert4torch training framework with concise, efficient code and optimized GPU memory usage during training.
Strong compatibility: the trained model can plug directly into the transformers inference ecosystem.
Full-process implementation: covers the entire pipeline from pre-training → instruction fine-tuning → reward modeling → reinforcement learning.

Model Capabilities

Chinese text generation
Basic conversation
Text continuation

Use Cases

Education
Learning assistant: helps students answer basic study questions; can generate explanations and examples of fundamental learning content.
Entertainment
Simple chat: engages in everyday conversation; capable of basic greetings and simple topic discussion.