Mengzi BERT Base
A BERT model pre-trained on a 300 GB Chinese corpus with masked language modeling (MLM), part-of-speech (POS) tagging, and sentence order prediction (SOP) objectives
Downloads: 438
Release Time: 3/2/2022
Model Overview
A lightweight yet powerful Chinese pre-trained language model suitable for various Chinese natural language processing tasks
Model Features
Chinese optimization
Optimized for the characteristics of the Chinese language and trained on a 300 GB Chinese corpus
Multi-task training
Jointly trained on three tasks: masked language modeling, part-of-speech tagging, and sentence order prediction
Lightweight and efficient
A lighter architecture than comparable models while maintaining strong performance
Model Capabilities
Text understanding
Text completion
Semantic analysis
Sentence relationship judgment
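All of the capabilities above are exposed through a standard BERT encoder interface. The following is a minimal sketch of loading the model and extracting contextual token representations with the Hugging Face transformers library; the repository id Langboat/mengzi-bert-base is an assumption, since this card does not state the exact checkpoint name.

```python
# Minimal sketch: load the encoder and extract contextual token
# representations. The repository id "Langboat/mengzi-bert-base" is an
# assumption; substitute the actual checkpoint name if it differs.
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("Langboat/mengzi-bert-base")
model = BertModel.from_pretrained("Langboat/mengzi-bert-base")

inputs = tokenizer("孟子是战国时期的思想家。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per token: (batch, seq_len, hidden_size)
print(outputs.last_hidden_state.shape)
```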
Use Cases
Natural Language Processing
Text completion
Fill in the [MASK] positions in Chinese text
Predicts the masked content from the surrounding context, as in the sketch below
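A minimal sketch of the completion workflow using the transformers fill-mask pipeline, again assuming the checkpoint id Langboat/mengzi-bert-base:

```python
# Minimal fill-mask sketch. The checkpoint id "Langboat/mengzi-bert-base"
# is an assumption, not stated on this card.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="Langboat/mengzi-bert-base")

# The pipeline returns candidate tokens for the [MASK] slot with scores.
for candidate in fill_mask("生活的真谛是[MASK]。"):
    print(candidate["token_str"], round(candidate["score"], 4))
```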
Text classification
Classify Chinese texts into predefined categories
Performs strongly on classification benchmarks such as TNEWS; see the sketch below
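A sketch of how the model could be applied to a TNEWS-style classification task. The classification head below is randomly initialized, so fine-tuning on labeled data is required before predictions are meaningful; the checkpoint id and the 15-label setup are assumptions:

```python
# Sketch of sequence classification on a TNEWS-style task. The base
# checkpoint ships no classification head, so the head added here is
# randomly initialized and must be fine-tuned first. The checkpoint id
# and the 15-label setup are assumptions.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("Langboat/mengzi-bert-base")
model = BertForSequenceClassification.from_pretrained(
    "Langboat/mengzi-bert-base", num_labels=15  # TNEWS defines 15 news categories
)

inputs = tokenizer("北京冬奥会开幕式举行", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1))  # predicted category index
```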
Semantic understanding
Natural language inference
Determine the logical relationship between two sentences
Reported 82.12% accuracy on the CMNLI task; see the sketch below
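A sketch of CMNLI-style natural language inference framed as three-way sentence-pair classification. As with the classification example, the head must be fine-tuned before use; the checkpoint id and label order are assumptions:

```python
# Sketch of NLI on CMNLI-style sentence pairs. The classification head is
# untrained until fine-tuned on CMNLI; checkpoint id and label order
# (entailment / neutral / contradiction) are assumptions.
import torch
from transformers import BertTokenizer, BertForSequenceClassification

tokenizer = BertTokenizer.from_pretrained("Langboat/mengzi-bert-base")
model = BertForSequenceClassification.from_pretrained(
    "Langboat/mengzi-bert-base", num_labels=3
)

premise = "他每天早上都去跑步。"
hypothesis = "他有锻炼的习惯。"
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # probabilities over the three relations
```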