Chinese Pretrain MRC MacBERT Large
A MacBERT-large model further trained on massive Chinese machine reading comprehension data, showing significant improvements on tasks such as reading comprehension and text classification
Downloads: 106
Release Time: 3/2/2022
Model Overview
MacBERT-Large is a pre-trained model optimized for Chinese machine reading comprehension tasks. It is based on the original MacBERT-large model from Harbin Institute of Technology and has undergone secondary pre-training on large-scale Chinese machine reading comprehension data, demonstrating excellent performance in multiple competitions.
Model Features
Competition-level Performance
Helped multiple participants achieve top-five rankings in competitions such as DuReader-2021
Optimized Model
Significant performance improvement over the original pre-trained model on multiple datasets
Multi-task Applicability
Suitable for various NLP tasks including machine reading comprehension and text classification
Model Capabilities
Chinese Text Understanding
Q&A System Construction
Text Classification
Machine Reading Comprehension (see the usage sketch below)
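As a quick illustration of the Q&A and reading comprehension capabilities, the following minimal sketch loads the model through the Hugging Face transformers question-answering pipeline. The repository name luhua/chinese_pretrain_mrc_macbert_large is an assumption inferred from the page title and should be verified on Hugging Face before use.

```python
from transformers import pipeline

# Assumed Hugging Face repository name inferred from the title;
# verify the exact model ID before running.
MODEL_ID = "luhua/chinese_pretrain_mrc_macbert_large"

# Build an extractive question-answering pipeline; the model selects
# an answer span directly from the provided context.
qa = pipeline("question-answering", model=MODEL_ID, tokenizer=MODEL_ID)

result = qa(
    question="哈尔滨工业大学位于哪个城市?",
    context="哈尔滨工业大学位于黑龙江省哈尔滨市,是一所以理工为主的大学。",
)
print(result["answer"], result["score"])
```

The pipeline returns the extracted answer span together with a confidence score, which is typically how an MRC model like this is wired into a downstream Q&A system.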
Use Cases
Competition Applications
DuReader-2021 Competition
Used for Chinese machine reading comprehension competitions
Helped multiple participants achieve top-five rankings
Daguan Cup-2021
Used for text processing competitions
Achieved an F1 score of 70.45 on the validation set (a classification usage sketch follows the use cases below)
Medical Q&A
Tencent Medical Q&A
Used for medical domain Q&A systems
Achieved 83.4% accuracy on the test set
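For classification-style use cases such as the Daguan Cup, the encoder can be reused with a task-specific head. The sketch below attaches a freshly initialized classification head on top of the pre-trained weights; the model ID and num_labels=2 are illustrative assumptions, and the head would still need fine-tuning on labeled task data.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Assumed repository name; adjust num_labels to the target task's label count.
MODEL_ID = "luhua/chinese_pretrain_mrc_macbert_large"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
# A new classification head is attached on top of the pre-trained encoder;
# its weights are randomly initialized until fine-tuned.
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID, num_labels=2)

inputs = tokenizer("这段文本属于哪个类别?", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.argmax(dim=-1).item())  # predicted label index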