
DeBERTa V2 XLarge MNLI

Developed by Microsoft
DeBERTa V2 XLarge is an enhanced natural language understanding (NLU) model from Microsoft. It improves on the BERT architecture with a disentangled attention mechanism and an enhanced mask decoder, and it outperforms BERT and RoBERTa on a range of NLU tasks.
Downloads: 51.59k
Release date: 3/2/2022

Model Overview

An improved BERT-style architecture built on disentangled attention, targeting natural language understanding tasks, with strong results on benchmarks such as GLUE and SQuAD.

Model Features

Disentangled Attention Mechanism
Computes attention from content and relative position separately, strengthening the model's grasp of positional relationships in text.
Enhanced Mask Decoder
Incorporates absolute position information when predicting masked tokens, better capturing contextual dependencies.
Large-scale Pretraining
Pretrained on 80GB of training data to learn richer language representations.
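The disentangled attention idea above can be sketched numerically: each attention score is the sum of a content-to-content, a content-to-position, and a position-to-content term, each computed with its own projection. The sketch below is a minimal toy version in NumPy; the dimensions, the random values, the simple offset-clipping `pos_idx` helper, and the direct indexing of position embeddings are all illustrative assumptions (real DeBERTa buckets relative distances and shares position embeddings across layers).

```python
import numpy as np

rng = np.random.default_rng(0)
L, d = 4, 8  # toy sequence length and hidden size (assumptions, not model values)

Hc = rng.normal(size=(L, d))  # content hidden states
Pr = rng.normal(size=(L, d))  # relative position embeddings (simplified: one per clipped offset)

# Separate projections for content and position, the core of "disentangled" attention.
Wqc, Wkc = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wqr, Wkr = rng.normal(size=(d, d)), rng.normal(size=(d, d))

Qc, Kc = Hc @ Wqc, Hc @ Wkc  # content queries / keys
Qr, Kr = Pr @ Wqr, Pr @ Wkr  # position queries / keys

def pos_idx(i, j):
    # Map a relative offset i-j into a valid embedding row (toy clipping scheme).
    return min(max(i - j + L // 2, 0), L - 1)

scores = np.zeros((L, L))
for i in range(L):
    for j in range(L):
        c2c = Qc[i] @ Kc[j]              # content-to-content
        c2p = Qc[i] @ Kr[pos_idx(i, j)]  # content-to-position
        p2c = Kc[j] @ Qr[pos_idx(j, i)]  # position-to-content
        scores[i, j] = (c2c + c2p + p2c) / np.sqrt(3 * d)  # scaled by sqrt(3d), per the three terms

# Softmax over keys turns the summed scores into attention weights.
weights = np.exp(scores - scores.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
```

The point of the decomposition is that position information enters the score explicitly rather than being baked into the token embeddings, which is what the feature description refers to.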

Model Capabilities

Text Classification
Question Answering
Semantic Similarity Calculation
Natural Language Inference
Sentence Pair Classification
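For the natural language inference capability, the MNLI classification head emits one logit per class, which is softmaxed into probabilities. The minimal sketch below shows that post-processing step in plain Python; the label order is an assumption for illustration, so check the model's `id2label` config before relying on it, and the example logits are made up.

```python
import math

# Assumed label order for illustration only; verify against the model config.
LABELS = ["contradiction", "neutral", "entailment"]

def nli_predict(logits):
    """Softmax three class logits and return the top (label, probability)."""
    m = max(logits)  # subtract max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], probs[best]

# Hypothetical logits for a premise/hypothesis pair the model finds clearly entailed.
label, p = nli_predict([-2.1, -0.3, 3.4])
```

In practice these logits would come from running the tokenized premise/hypothesis pair through the sequence-classification model; only the decoding step is shown here.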

Use Cases

Text Understanding
Sentiment Analysis
Analyze text sentiment (positive/negative).
Achieved 97.5% accuracy on the SST-2 dataset.
Question Answering
Answer questions based on given text.
Achieved F1 score of 91.4 and EM score of 88.9 on SQuAD 2.0.
Semantic Analysis
Semantic Similarity Judgment
Determine the semantic similarity between two sentences.
Pearson correlation coefficient of 92.9 on the STS-B dataset.
Natural Language Inference
Determine logical relationships between texts (entailment/contradiction/neutral).
Accuracy of 91.7% (matched)/91.6% (mismatched) on the MNLI dataset.
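An NLI model like this one is also commonly repurposed for zero-shot classification: each candidate label is turned into a hypothesis such as "This example is about {label}", and the entailment probability becomes the label score. The sketch below shows that control flow only; `nli_entailment_prob` is a hypothetical stand-in for a real model call, and the toy keyword-overlap scorer exists purely to make the example self-contained.

```python
def zero_shot_classify(text, labels, nli_entailment_prob):
    """Score each label by how strongly the text entails a label-bearing hypothesis."""
    scores = {
        lab: nli_entailment_prob(text, f"this example is about {lab}")
        for lab in labels
    }
    best = max(scores, key=scores.get)
    return best, scores

def toy_entailment(premise, hypothesis):
    # Stand-in "model": word overlap between premise and hypothesis,
    # just to exercise the control flow above. Not a real NLI score.
    p, h = set(premise.lower().split()), set(hypothesis.lower().split())
    return len(p & h) / max(len(h), 1)

label, scores = zero_shot_classify(
    "the movie was a delight from start to finish",
    ["movie", "cooking", "sports"],
    toy_entailment,
)
```

Swapping `toy_entailment` for a real entailment-probability function built on the model turns this into the standard zero-shot recipe without changing the surrounding logic.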