
DeBERTa V2 XXLarge MNLI

Developed by Microsoft
DeBERTa V2 XXLarge is a 1.5-billion-parameter enhanced BERT variant built on a disentangled attention mechanism, surpassing RoBERTa and XLNet on natural language understanding tasks.
Downloads 4,077
Release Time : 3/2/2022

Model Overview

A pre-trained language model improved with a disentangled attention mechanism and an enhanced mask decoder, fine-tuned on the MNLI task and suitable for a range of natural language understanding tasks.

Model Features

Disentangled Attention Mechanism
Computes content and positional attention separately, making the model more sensitive to positional information.
Enhanced Masked Decoder
An improved masked language modeling objective that better captures the absolute positions of masked tokens during prediction.
Large-scale Pre-training
Trained on 80 GB of text data, achieving state-of-the-art performance on multiple NLU tasks.
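The disentangled attention mechanism described above scores each token pair with three terms: content-to-content, content-to-position, and position-to-content, using relative position embeddings. The following is a minimal single-head NumPy sketch of that idea; the dimensions, random weights, and clipping window are illustrative assumptions, not the model's actual configuration.

```python
import numpy as np

rng = np.random.default_rng(0)
n, d, k = 4, 8, 2                     # toy sequence length, hidden size, max relative distance

H = rng.normal(size=(n, d))           # content hidden states
P = rng.normal(size=(2 * k, d))       # relative position embeddings
Wq, Wk = rng.normal(size=(2, d, d))   # content query/key projections
Wqr, Wkr = rng.normal(size=(2, d, d)) # position query/key projections

def delta(i, j):
    # Clipped relative distance between positions i and j, mapped into [0, 2k)
    return int(np.clip(i - j + k, 0, 2 * k - 1))

Qc, Kc = H @ Wq, H @ Wk               # content queries and keys
Qr, Kr = P @ Wqr, P @ Wkr             # position queries and keys

A = np.zeros((n, n))
for i in range(n):
    for j in range(n):
        c2c = Qc[i] @ Kc[j]           # content-to-content score
        c2p = Qc[i] @ Kr[delta(i, j)] # content-to-position score
        p2c = Kc[j] @ Qr[delta(j, i)] # position-to-content score
        A[i, j] = (c2c + c2p + p2c) / np.sqrt(3 * d)  # scale over the 3 terms

def softmax(x):
    e = np.exp(x - x.max(axis=-1, keepdims=True))
    return e / e.sum(axis=-1, keepdims=True)

out = softmax(A) @ H                  # attention-weighted content states
```

Separating the position terms this way lets the model weight "what a token says" and "where it sits relative to the query" independently, which is the sensitivity gain the feature list refers to.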

Model Capabilities

Natural Language Inference
Text Classification
Question Answering
Semantic Similarity Calculation
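For natural language inference with the MNLI-tuned head, the model emits one logit per class for a premise/hypothesis pair, which is turned into a label via softmax. A minimal sketch of that decoding step follows; the logit values and the label order are hypothetical and should be checked against the model's config.

```python
import math

# Hypothetical three-way NLI logits for one premise/hypothesis pair.
# The label order here is an assumption, not taken from the model config.
LABELS = ["contradiction", "neutral", "entailment"]
logits = [-2.1, -0.3, 3.4]

# Numerically stable softmax over the class logits
exps = [math.exp(x - max(logits)) for x in logits]
total = sum(exps)
probs = [e / total for e in exps]

prediction = LABELS[probs.index(max(probs))]  # class with highest probability
```

The same decoding applies to any of the classification capabilities above; only the label set changes.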

Use Cases

Text Understanding
Sentiment Analysis
Analyze text sentiment orientation
Achieved 97.2% accuracy on SST-2 dataset
Question Answering
Open-domain question answering tasks
F1 score of 92.2 and exact match (EM) of 89.7 on SQuAD 2.0
Semantic Analysis
Textual Entailment Recognition
Determine logical relationships between texts
93.5% accuracy on RTE task
Semantic Similarity Calculation
Calculate semantic similarity between sentences
Pearson correlation coefficient of 93.2 on STS-B
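The STS-B figure above is a Pearson correlation between the model's predicted similarity scores and human gold scores. A self-contained sketch of that metric, with made-up illustrative score lists:

```python
import math

def pearson(x, y):
    # Pearson correlation: covariance divided by the product of standard deviations
    n = len(x)
    mx, my = sum(x) / n, sum(y) / n
    cov = sum((a - mx) * (b - my) for a, b in zip(x, y))
    sx = math.sqrt(sum((a - mx) ** 2 for a in x))
    sy = math.sqrt(sum((b - my) ** 2 for b in y))
    return cov / (sx * sy)

# Toy predicted vs. gold similarity scores (illustrative values only,
# not actual model outputs or STS-B annotations)
pred = [0.1, 2.5, 3.0, 4.8, 1.2]
gold = [0.0, 2.0, 3.5, 5.0, 1.0]
r = pearson(pred, gold)
```

A value near 1.0 means the predicted rankings and magnitudes closely track the gold scores, which is what the 93.2 (i.e., r = 0.932 on the conventional ×100 scale) reflects.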