RBT3
RBT3 is a Chinese pre-trained BERT model that uses whole word masking, developed by the HIT-iFLYTEK Joint Lab (HFL) to advance Chinese natural language processing.
Downloads: 6,626
Release date: 3/2/2022
Model Overview
The model is re-trained from the RoBERTa-wwm-ext architecture, focuses on Chinese text processing, and supports fill-mask tasks.
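A minimal quick-start sketch with the Hugging Face transformers library, assuming `hfl/rbt3` is the checkpoint id for this model on the Hugging Face Hub:

```python
from transformers import AutoConfig, pipeline

# Inspect the lightweight configuration: this checkpoint should report
# 3 hidden layers, versus 12 for BERT-base.
config = AutoConfig.from_pretrained("hfl/rbt3")
print(config.num_hidden_layers)

# Fill-mask inference: the BERT-style [MASK] token marks the blank.
fill_mask = pipeline("fill-mask", model="hfl/rbt3")
for pred in fill_mask("中国的首都是北[MASK]。"):
    print(pred["token_str"], round(pred["score"], 4))
```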
Model Features
Whole Word Masking
Uses whole word masking instead of character-level masking, which better matches Chinese word boundaries and improves the model's comprehension (illustrated in the sketch after this section).
Chinese Optimization
Trained specifically on Chinese text, giving strong performance on Chinese NLP tasks.
Lightweight Architecture
Uses only three transformer layers, making it lighter and faster than the full 12-layer BERT-base model.
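To make whole word masking concrete, here is a minimal, self-contained sketch of the idea: when a segmented word is selected for masking during pretraining, every character of that word is masked together. The sentence and segmentation below are hypothetical examples, not drawn from the actual training data.

```python
import random

# Hypothetical pre-segmented sentence; the real pretraining pipeline used
# a Chinese word segmenter (e.g. LTP) to determine word boundaries.
words = ["使用", "语言", "模型", "来", "预测", "下一个", "词"]

def whole_word_mask(words, mask_prob=0.3, seed=0):
    rng = random.Random(seed)
    tokens = []
    for word in words:
        if rng.random() < mask_prob:
            # Whole word masking: all characters of the selected word are
            # replaced together, instead of masking characters independently.
            tokens.extend(["[MASK]"] * len(word))
        else:
            tokens.extend(list(word))
    return tokens

print(" ".join(whole_word_mask(words)))
```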
Model Capabilities
Chinese Text Understanding
Fill-Mask Prediction
Contextual Semantic Analysis
Use Cases
Natural Language Processing
Text Fill-in-the-Blank
Predicts masked words that fit the surrounding context, as in the quick-start example above.
Text Classification
Can be fine-tuned for Chinese text classification and performs well across a range of Chinese classification tasks; see the sketch below.
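A minimal fine-tuning setup sketch for the classification use case, again assuming the `hfl/rbt3` checkpoint id and an illustrative binary sentiment task. The classification head is newly initialized, so the model must be fine-tuned on labeled data before its predictions mean anything.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Load the encoder with a fresh 2-class classification head on top.
# The head weights are randomly initialized and require fine-tuning.
tokenizer = AutoTokenizer.from_pretrained("hfl/rbt3")
model = AutoModelForSequenceClassification.from_pretrained("hfl/rbt3", num_labels=2)

inputs = tokenizer("这部电影非常好看!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # untrained head: scores are not meaningful yet
```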