
MindLLM-1B3-chat-zh-v2.0

Developed by bit-dny
MindLLM-1.3B is a 1.3-billion-parameter Transformer model jointly developed by the Beijing Engineering Research Center of Massive Language Information Processing and Cloud Computing Applications and the Southeast Institute of Information Technology at the Beijing Institute of Technology. It supports dialogue generation in both Chinese and English.
Downloads: 122
Release date: 12/17/2023

Model Overview

A lightweight large language model trained on bilingual corpora such as the Pile, WuDao, and CBook, together with safety-education web data. It performs well at commonsense reasoning and language understanding.

Model Features

Lightweight Design
With only 1.3 billion parameters, it surpasses the performance of some 13-billion-parameter models.
Bilingual Support
Supports both Chinese and English dialogue generation tasks.
Domain Adaptability
Trained on vertical-domain data such as safety education, making it suitable for specialized professional scenarios.

Model Capabilities

Text Generation
Multi-turn Dialogue
Commonsense Reasoning
Language Understanding
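
To make these capabilities concrete, here is a minimal sketch of loading the model and generating a reply with Hugging Face Transformers. The repository id and the generation settings below are assumptions derived from this page's title, not values confirmed by the model card.

```python
# Minimal sketch: load the model and generate a reply with Hugging Face Transformers.
# The repo id below is an assumption based on the page title, not a confirmed value.
from transformers import AutoModelForCausalLM, AutoTokenizer

repo_id = "bit-dny/MindLLM-1b3-chat-zh-v2.0"  # hypothetical repository id

tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(repo_id, trust_remote_code=True)

prompt = "What are the advantages of electric vehicles?"
inputs = tokenizer(prompt, return_tensors="pt")

# Sample a short continuation; these generation settings are illustrative only.
outputs = model.generate(
    **inputs,
    max_new_tokens=256,
    do_sample=True,
    temperature=0.7,
)
# Decode only the newly generated tokens, skipping the prompt.
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```

Because sampling is enabled, replies vary between runs; dropping the sampling arguments yields deterministic greedy output.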

Use Cases

Intelligent Assistant
Q&A System
Answers users' commonsense questions, such as the advantages of electric vehicles.
Generates structured lists of advantages (zero emissions, low maintenance costs, etc.).
Educational Applications
Safety Education
Answers knowledge questions drawing on the safety-education content included in its training data (see the prompt sketch below).
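
For the multi-turn dialogue and Q&A scenarios above, the conversation history must be flattened into a single prompt before generation. A minimal sketch follows; the role tags are illustrative assumptions, since the dialogue template MindLLM actually expects is not documented on this page.

```python
# Illustrative only: flatten a chat history into one prompt string.
# The "<user>"/"<assistant>" role tags are assumptions, not MindLLM's documented template.
history = [
    ("user", "What safety rules should children follow when crossing a road?"),
    ("assistant", "Use crosswalks, obey traffic lights, and look both ways."),
    ("user", "What about when riding a bicycle?"),
]

prompt = "".join(f"<{role}>: {text}\n" for role, text in history) + "<assistant>: "
print(prompt)  # feed this string to the tokenizer/generate call shown in the earlier sketch
```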