Chinese Pretrain MRC RoBERTa WWM Ext Large
Developed by luhua
The roberta_wwm_ext_large model, trained on large-scale Chinese machine reading comprehension data, delivers significant improvements on tasks such as reading comprehension and text classification.
Downloads: 790
Release Time: 3/2/2022
Model Overview
This model is a Chinese pre-trained model based on the RoBERTa-wwm-ext-large architecture, specifically optimized for Chinese reading comprehension tasks. It has achieved excellent results in multiple Chinese reading comprehension competitions.
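A minimal sketch of extractive question answering with this checkpoint via Hugging Face transformers, assuming the Hub ID luhua/chinese_pretrain_mrc_roberta_wwm_ext_large; the sample question and context are illustrative assumptions, not taken from the model card.

```python
import torch
from transformers import AutoModelForQuestionAnswering, AutoTokenizer

# Assumed Hub ID for this checkpoint.
model_name = "luhua/chinese_pretrain_mrc_roberta_wwm_ext_large"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForQuestionAnswering.from_pretrained(model_name)

# Illustrative question/context pair (not from the card).
question = "著名诗歌《假如生活欺骗了你》的作者是?"
context = "普希金创作了诗歌《假如生活欺骗了你》，这首诗广为流传。"

inputs = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Take the most likely answer span from the start/end logits.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1])
print(answer)
```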
Model Features
Chinese Reading Comprehension Optimization
Specially optimized for Chinese reading comprehension tasks, achieving outstanding performance in multiple competitions.
Based on RoBERTa-wwm-ext-large Architecture
Utilizes the RoBERTa large model architecture with Whole Word Masking (WWM) technology.
Competition-Validated Performance
Helped developers secure top-five rankings in competitions such as DuReader-2021.
Model Capabilities
Chinese Text Understanding
Q&A System Construction (see the usage sketch after this list)
Text Classification
Machine Reading Comprehension
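For Q&A system construction, a hedged usage sketch with the transformers pipeline API, which wraps tokenization and span extraction in one call; the Hub ID is the assumed checkpoint name and the question/context pair is an illustrative example.

```python
from transformers import pipeline

# Load the checkpoint into a ready-to-use extractive QA pipeline.
qa = pipeline(
    "question-answering",
    model="luhua/chinese_pretrain_mrc_roberta_wwm_ext_large",
)

# Illustrative query; the pipeline returns the extracted answer span
# together with a confidence score and character offsets.
result = qa(
    question="苏炳添的100米最好成绩是多少?",
    context="苏炳添是中国男子短跑运动员，他在2021年跑出了9秒83的100米亚洲纪录。",
)
print(result["answer"], result["score"])
```

The pipeline route is convenient for prototyping a Q&A service; the lower-level loading sketch above is preferable when you need custom batching or post-processing of the start/end logits.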
Use Cases
Competition Applications
DuReader-2021 Competition
Used for Chinese machine reading comprehension competitions
Helped multiple developers achieve top-five rankings
Medical Q&A
Tencent Medical Q&A System
Applied in medical field Q&A systems
Achieved 83.1% accuracy on the test set