Chinese RoBERTa-wwm-ext
A Chinese pretrained BERT model that uses whole word masking, designed to accelerate the development of Chinese natural language processing.
Downloads 96.54k
Release Time: 3/2/2022
Model Overview
Chinese Whole Word Masking BERT (RoBERTa-wwm-ext) is a pretrained language model designed for Chinese natural language processing tasks. During pretraining it masks whole Chinese words rather than isolated characters, which improves the quality of the learned representations and downstream performance.
Model Features
Whole Word Masking
Masks all the characters of a segmented Chinese word together during pretraining, rather than masking characters in isolation, which strengthens Chinese pretrained representations.
Chinese Optimization
Specifically optimized for Chinese natural language processing tasks.
Pretrained Model
Provides ready-to-use pretrained weights that can be fine-tuned or used directly for downstream tasks; a loading sketch follows this list.
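A minimal loading sketch with Hugging Face transformers is shown below. The hub id "hfl/chinese-roberta-wwm-ext" and the use of the BERT classes (the checkpoint follows the BERT architecture despite the RoBERTa-style training) are assumptions about how the weights are published; adjust them to the checkpoint you actually use.

```python
# Minimal sketch (assumption: the checkpoint is available on the Hugging Face Hub
# as "hfl/chinese-roberta-wwm-ext" and is loaded with the BERT classes).
import torch
from transformers import BertTokenizer, BertModel

tokenizer = BertTokenizer.from_pretrained("hfl/chinese-roberta-wwm-ext")
model = BertModel.from_pretrained("hfl/chinese-roberta-wwm-ext")

# Encode a Chinese sentence and extract contextual features.
inputs = tokenizer("今天天气真好。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # e.g. (1, seq_len, 768) for the base-size model
```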
Model Capabilities
Text Classification
Named Entity Recognition
Question Answering
Text Generation
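These capabilities build on the masked-language-model pretraining objective. A quick way to sanity-check a downloaded checkpoint is the fill-mask pipeline; this is a hedged example that assumes the published weights include the masked-LM head and uses the same assumed hub id as above.

```python
from transformers import pipeline

# Assumes the hub checkpoint ships its masked-LM head; [MASK] is the BERT mask token.
fill = pipeline("fill-mask", model="hfl/chinese-roberta-wwm-ext")
for pred in fill("北京是中国的[MASK]都。"):
    print(pred["token_str"], round(pred["score"], 3))
```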
Use Cases
Natural Language Processing
Chinese Text Classification
Can be used for tasks such as sentiment analysis and topic classification of Chinese texts; a fine-tuning sketch follows this list.
Named Entity Recognition
Can be used to identify entities such as person names, locations, and organization names in Chinese texts.
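As a sketch of the text-classification use case above, the encoder can be wrapped with a sequence-classification head and fine-tuned. The label set, example sentences, and training step below are illustrative placeholders, not part of the original model release.

```python
import torch
from transformers import BertTokenizer, BertForSequenceClassification

name = "hfl/chinese-roberta-wwm-ext"  # assumed hub id, as above
tokenizer = BertTokenizer.from_pretrained(name)
model = BertForSequenceClassification.from_pretrained(name, num_labels=2)

# Toy sentiment batch: 1 = positive, 0 = negative (illustrative labels only).
batch = tokenizer(["这部电影太精彩了", "服务态度很差"], padding=True, return_tensors="pt")
labels = torch.tensor([1, 0])

loss = model(**batch, labels=labels).loss
loss.backward()  # from here, plug into your usual optimizer / training loop
```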