DeBERTa V2 XLarge
DeBERTa V2 XLarge is an enhanced natural language understanding model developed by Microsoft. It improves on the BERT architecture with a disentangled attention mechanism and an enhanced mask decoder, achieving state-of-the-art (SOTA) results on multiple NLP benchmarks.
Downloads: 116.71k
Release Time: 3/2/2022
Model Overview
An enhanced BERT variant built on disentangled attention and focused on natural language understanding, supporting downstream tasks such as text classification, question answering, and semantic similarity.
Model Features
Disentangled Attention Mechanism
Encodes each token with separate content and relative-position vectors, so that attention weights capture content-to-content, content-to-position, and position-to-content interactions rather than mixing them into a single representation.
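A minimal NumPy sketch of this idea, with toy dimensions and illustrative matrix names (these are assumptions for exposition, not the model's actual parameter names): each attention score sums a content-to-content, a content-to-position, and a position-to-content term, scaled by sqrt(3d) as described in the DeBERTa paper.

```python
import numpy as np

rng = np.random.default_rng(0)
d = 8   # per-head dimension (toy size; the real model is much larger)
n = 5   # sequence length
k = 4   # maximum relative distance tracked

H = rng.normal(size=(n, d))       # content embeddings, one per token
P = rng.normal(size=(2 * k, d))   # shared relative-position embeddings

# Separate projections for content and position -- the "disentangling"
Wq, Wk = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wq_r, Wk_r = rng.normal(size=(d, d)), rng.normal(size=(d, d))

Qc, Kc = H @ Wq, H @ Wk           # content queries and keys
Qr, Kr = P @ Wq_r, P @ Wk_r       # position queries and keys

def rel_idx(i, j, k):
    # clamp the relative distance i - j into the bucket range [0, 2k)
    return int(np.clip(i - j + k, 0, 2 * k - 1))

A = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        c2c = Qc[i] @ Kc[j]                 # content-to-content
        c2p = Qc[i] @ Kr[rel_idx(i, j, k)]  # content-to-position
        p2c = Qr[rel_idx(j, i, k)] @ Kc[j]  # position-to-content
        A[i, j] = (c2c + c2p + p2c) / np.sqrt(3 * d)

# Row-wise softmax turns scores into attention weights
attn = np.exp(A - A.max(axis=-1, keepdims=True))
attn /= attn.sum(axis=-1, keepdims=True)
```

Because the position embeddings are relative and shared across the sequence, the same position matrices serve every token pair at a given offset.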
Enhanced Masked Decoder
An improved masked language modeling objective that incorporates absolute position information in the decoding layer when predicting masked tokens, better capturing their dependencies.
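The enhanced decoder builds on standard masked language modeling, in which a fraction of tokens is replaced by a mask symbol and the model must reconstruct the originals; DeBERTa additionally injects absolute positions just before prediction. A toy sketch of the masking step only (the 15% rate and the `[MASK]` string are conventional assumptions, and real implementations operate on token ids, not strings):

```python
import random

MASK = "[MASK]"  # illustrative mask symbol

def mask_tokens(tokens, rate=0.15, seed=1):
    # Select roughly `rate` of positions at random, record the original
    # token at each selected position, and replace it with the mask symbol.
    rng = random.Random(seed)
    corrupted, targets = list(tokens), {}
    for i, tok in enumerate(tokens):
        if rng.random() < rate:
            targets[i] = tok
            corrupted[i] = MASK
    return corrupted, targets

corrupted, targets = mask_tokens("the cat sat on the mat".split())
```

The training loss is then computed only at the masked positions stored in `targets`.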
Large-scale Pretraining
Pretrained using 160GB of high-quality text data to learn deep language representations.
Model Capabilities
Text Classification
Question Answering
Semantic Similarity Computation
Natural Language Inference
Linguistic Acceptability Judgment
Use Cases
Intelligent Customer Service
Question Intent Classification
Automatically identifies the intent category of user questions.
The fine-tuned model achieves 91.7% accuracy on the MNLI (matched) benchmark.
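Once fine-tuned for natural language inference, the model emits three logits per premise-hypothesis pair, which are mapped to a label via softmax. A hedged, self-contained sketch (the label order below is an assumption; always check the actual checkpoint's label mapping):

```python
import math

# Assumed MNLI label order -- verify against the fine-tuned model's config
LABELS = ["contradiction", "neutral", "entailment"]

def predict_label(logits):
    # Numerically stable softmax over the three class logits
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    # The predicted label is the class with the highest probability
    return LABELS[probs.index(max(probs))], probs

label, probs = predict_label([0.1, 0.2, 3.0])
```

An intent classifier for customer service can reuse the same pattern with intent names in place of the NLI labels.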
Educational Assessment
Grammar Correctness Judgment
Evaluates the grammatical correctness of student writing.
Achieves a Matthews correlation coefficient (MCC) score of 72.0 on the CoLA dataset.
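CoLA is scored with the Matthews correlation coefficient, which remains informative under class imbalance; reported scores such as 72.0 are the coefficient multiplied by 100. A self-contained sketch of the binary MCC formula:

```python
import math

def matthews_corrcoef(y_true, y_pred):
    # Confusion-matrix counts for binary labels
    # (1 = grammatically acceptable, 0 = unacceptable)
    tp = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 1)
    tn = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 0)
    fp = sum(1 for t, p in zip(y_true, y_pred) if t == 0 and p == 1)
    fn = sum(1 for t, p in zip(y_true, y_pred) if t == 1 and p == 0)
    denom = math.sqrt((tp + fp) * (tp + fn) * (tn + fp) * (tn + fn))
    # MCC is undefined when any marginal is zero; 0.0 is a common convention
    return 0.0 if denom == 0 else (tp * tn - fp * fn) / denom
```

MCC ranges from -1 (total disagreement) through 0 (chance level) to 1 (perfect prediction), unlike plain accuracy, which can look high on skewed data.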