
Chinese RoBERTa-wwm-ext-large

Developed by hfl
A Chinese pre-trained BERT model that employs a whole word masking strategy, designed to accelerate Chinese natural language processing research.
Downloads 30.27k
Release Time: 3/2/2022

Model Overview

This is a Chinese pre-trained language model based on the BERT architecture, trained with a whole word masking strategy and suitable for a wide range of Chinese natural language processing tasks.

Model Features

Whole Word Masking Strategy
Masks whole words rather than individual characters during pre-training. Because most Chinese words span multiple characters, this better matches the structure of the language and improves the model's comprehension.
Chinese Optimization
Pre-trained on Chinese corpora and tailored to the characteristics of the language, giving strong performance on Chinese NLP tasks.
Pre-trained Model
Provides pre-trained model weights that can be fine-tuned directly for downstream tasks; a minimal loading sketch follows this list.
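
As a minimal sketch of how these pre-trained weights are typically loaded, the example below uses the Hugging Face transformers library and assumes the Hub ID hfl/chinese-roberta-wwm-ext-large. Note that despite the RoBERTa name, the checkpoint follows the BERT architecture, so it is loaded with the BERT classes.

```python
from transformers import BertTokenizer, BertForMaskedLM, pipeline

# Assumed Hub ID; despite the "RoBERTa" name, the checkpoint uses
# the BERT architecture and is loaded with the BERT classes.
MODEL_ID = "hfl/chinese-roberta-wwm-ext-large"

tokenizer = BertTokenizer.from_pretrained(MODEL_ID)
model = BertForMaskedLM.from_pretrained(MODEL_ID)

# Whole word masking at pre-training time means multi-character words
# such as 模型 ("model") were masked as a unit, so the model learns to
# reconstruct whole words rather than isolated characters.
fill_mask = pipeline("fill-mask", model=model, tokenizer=tokenizer)
for prediction in fill_mask("使用语言[MASK]型来预测下一个词的概率。"):
    print(prediction["token_str"], round(prediction["score"], 3))
```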

Model Capabilities

Text Classification
Named Entity Recognition
Question Answering Systems
Text Similarity Calculation (see the embedding sketch after this list)
Text Generation
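
As one way to realize the text similarity capability above, the sketch below mean-pools the encoder's hidden states into sentence vectors and compares them with cosine similarity. The pooling choice and Hub ID are illustrative assumptions, not prescribed by this page.

```python
import torch
from transformers import BertTokenizer, BertModel

MODEL_ID = "hfl/chinese-roberta-wwm-ext-large"  # assumed Hub ID
tokenizer = BertTokenizer.from_pretrained(MODEL_ID)
model = BertModel.from_pretrained(MODEL_ID)
model.eval()

def embed(text: str) -> torch.Tensor:
    """Mean-pool the last hidden states into one sentence vector."""
    inputs = tokenizer(text, return_tensors="pt")
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, 1024)
    return hidden.mean(dim=1).squeeze(0)

a, b = embed("今天天气很好"), embed("今天天气不错")
print(torch.cosine_similarity(a, b, dim=0).item())
```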

Use Cases

Natural Language Processing
Sentiment Analysis
Analyzing sentiment polarity in Chinese text, typically by fine-tuning with a classification head
Machine Reading Comprehension
Building Chinese extractive question answering systems (a fine-tuning sketch for both use cases follows)
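
A hedged sketch of the fine-tuning setups for the two use cases above: a classification head for sentiment analysis and a span-extraction head for reading comprehension. The class names are standard transformers APIs; the label count and example input are placeholders, not taken from this page.

```python
from transformers import (
    BertTokenizer,
    BertForSequenceClassification,
    BertForQuestionAnswering,
)

MODEL_ID = "hfl/chinese-roberta-wwm-ext-large"  # assumed Hub ID
tokenizer = BertTokenizer.from_pretrained(MODEL_ID)

# Sentiment analysis: attach a classification head (num_labels is a
# placeholder; 2 = positive/negative) and fine-tune on labeled text.
sentiment_model = BertForSequenceClassification.from_pretrained(
    MODEL_ID, num_labels=2
)

# Reading comprehension: attach a span-extraction head that predicts
# answer start/end positions within a passage, then fine-tune on a
# Chinese QA dataset such as CMRC 2018.
qa_model = BertForQuestionAnswering.from_pretrained(MODEL_ID)

inputs = tokenizer("这部电影的剧情非常精彩。", return_tensors="pt")
logits = sentiment_model(**inputs).logits  # untrained head: logits are random
print(logits.shape)  # torch.Size([1, 2])
```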