ERNIE 2.0 Base En
ERNIE 2.0 is a continual pre-training framework proposed by Baidu in 2019. It incrementally constructs pre-training tasks and learns them through continual multi-task learning, and it outperforms BERT and XLNet on a range of English and Chinese tasks.
Release Time: 3/2/2022
Model Overview
ERNIE 2.0 is a continual learning pre-training framework that improves performance by incrementally introducing and learning new pre-training tasks. The resulting encoder is suitable for a wide range of natural language understanding tasks.
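For orientation, the following is a minimal sketch of loading the English base checkpoint and extracting contextual embeddings with Hugging Face transformers. The repository id "nghuyong/ernie-2.0-base-en" is an assumed community mirror of the released weights; substitute the checkpoint you actually use.

# Minimal sketch (assumption: the weights are mirrored at "nghuyong/ernie-2.0-base-en"
# on the Hugging Face Hub): load ERNIE 2.0 Base En and extract contextual embeddings.
import torch
from transformers import AutoTokenizer, AutoModel

model_id = "nghuyong/ernie-2.0-base-en"  # assumed mirror; replace with your checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("ERNIE 2.0 is a continual pre-training framework.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Per-token contextual embeddings: (batch_size, sequence_length, hidden_size)
print(outputs.last_hidden_state.shape)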
Model Features
Continual Pre-training Framework
Incrementally constructs pre-training tasks and learns them through continual multi-task learning, steadily improving model performance.
Multi-task Learning
Supports joint learning of multiple pre-training tasks, enhancing the model's generalization ability.
Superior Performance
Outperforms BERT and XLNet on the GLUE benchmark and on several Chinese NLU tasks.
Model Capabilities
Text Understanding
Text Classification
Natural Language Inference
Question Answering
Use Cases
Natural Language Processing
Text Classification
Assigns labels to text, for example sentiment analysis or topic classification.
Performs strongly on the GLUE classification tasks.
Natural Language Inference
Determines the logical relationship (e.g., entailment or contradiction) between two pieces of text; see the sketch after this section.
Outperforms BERT and XLNet on multiple tasks.
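The use cases above all amount to adding a task-specific head on top of the encoder and fine-tuning it. Below is a hedged sketch of setting up ERNIE 2.0 for a sentence-pair task such as natural language inference; the checkpoint id is again the assumed mirror, and the classification head is freshly initialized, so its predictions are meaningless until the model has been fine-tuned (e.g., on MNLI).

# Minimal sketch: preparing ERNIE 2.0 Base En for a sentence-pair task such as NLI.
# Assumption: checkpoint mirrored at "nghuyong/ernie-2.0-base-en"; the classification
# head below is randomly initialized and must be fine-tuned before use.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "nghuyong/ernie-2.0-base-en"  # assumed mirror; replace with your checkpoint
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(
    model_id, num_labels=3  # e.g. entailment / neutral / contradiction
)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."
inputs = tokenizer(premise, hypothesis, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits  # shape (1, 3); untrained head, illustration only
print(logits.softmax(dim=-1))

The same pattern covers single-sentence classification (e.g., sentiment analysis): pass one text instead of a pair and set num_labels to the number of classes.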