MiniLLM 0.2B WithWudao

Developed by Tongjilibo
MiniLLM is a lightweight Chinese language model built on the bert4torch framework. It covers the entire pipeline from pre-training to instruction fine-tuning and offers basic dialogue capabilities.
Downloads: 127
Release Time: 4/25/2025

Model Overview

A lightweight Chinese language model with a 12-layer Transformer architecture, pre-trained and then instruction fine-tuned on Alpaca-style data, suitable for basic dialogue scenarios.

Model Features

Full-process Training: Implements the complete pipeline from pre-training through instruction fine-tuning, reward modeling, and reinforcement learning.
Lightweight and Efficient: A 12-layer Transformer architecture that can be fine-tuned with as little as 17GB of GPU memory.
Framework Compatibility: Training uses the bert4torch framework, while inference is compatible with the transformers library (see the sketch after this list).
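
Below is a minimal inference sketch using the Hugging Face transformers library, illustrating the compatibility noted above. The repository id, the use of trust_remote_code, and the generation settings are assumptions for illustration; consult the published model card for the actual values.

    from transformers import AutoModelForCausalLM, AutoTokenizer

    # Assumed repository id; replace with the checkpoint actually published by Tongjilibo.
    model_id = "Tongjilibo/MiniLLM-0.2B-WithWudao"

    # trust_remote_code is assumed here in case the checkpoint ships custom model code.
    tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
    model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

    # Everyday-question prompt, matching the model's basic dialogue use case.
    prompt = "你好，今天适合做什么运动？"
    inputs = tokenizer(prompt, return_tensors="pt")

    # Sampling parameters chosen for illustration only.
    outputs = model.generate(**inputs, max_new_tokens=128, do_sample=True, top_p=0.9)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))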

Model Capabilities

Chinese text generation
Basic dialogue interaction
Instruction following

Use Cases

Intelligent Assistant
Daily Q&A: Answers everyday life-related questions, generating concise responses based on common sense.