
DeBERTa-v3 MNLI SNLI ANLI

Developed by NDugar
DeBERTa (Decoding-enhanced BERT with disentangled Attention) improves on BERT and RoBERTa with a disentangled attention mechanism and outperforms both on most natural language understanding tasks.
Downloads: 26
Release date: 3/2/2022

Model Overview

DeBERTa V2 XXLarge is a 1.5-billion-parameter natural language understanding model that combines a disentangled attention mechanism with an enhanced mask decoder, achieving strong results on multiple GLUE benchmark tasks.
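
For reference, here is a minimal sketch of running this checkpoint for natural language inference with the Hugging Face transformers library. The repo id NDugar/debertav3-mnli-snli-anli, the example sentence pair, and the label handling are assumptions; check the hosting page for the exact id and label mapping.

```python
# Minimal NLI inference sketch; the repo id below is an assumption based on
# the model title and author, not a confirmed identifier.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "NDugar/debertav3-mnli-snli-anli"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map logits to labels via the config's id2label (varies per checkpoint).
probs = logits.softmax(dim=-1).squeeze()
for i, p in enumerate(probs):
    print(model.config.id2label[i], round(p.item(), 3))
```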

Model Features

Disentangled Attention Mechanism
Separates content and relative-position representations in attention, so each attention score combines content-to-content, content-to-position, and position-to-content terms, improving contextual understanding (see the sketch after this list).
Enhanced Masked Decoder
An improved mask decoder incorporates absolute position information when predicting masked tokens during pretraining.
Large-scale Pretraining
Pretrained on 160GB of raw text data, with 1.5 billion parameters.
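
Since disentangled attention is the model's defining feature, the following is a minimal single-head sketch of the three attention terms from the DeBERTa paper, with toy dimensions; all variable names are illustrative, and details like relative-distance bucketing are simplified.

```python
# Toy single-head disentangled attention, following the DeBERTa formulation:
# score(i, j) = c2c + c2p + p2c, scaled by sqrt(3d). Illustrative only.
import math
import torch

L, d, k = 8, 64, 4                 # sequence length, head dim, max relative distance
H = torch.randn(L, d)              # content hidden states
P = torch.randn(2 * k, d)          # relative-position embeddings for distances in [-k, k)

Wq_c, Wk_c, Wv = (torch.randn(d, d) for _ in range(3))  # content projections
Wq_r, Wk_r = (torch.randn(d, d) for _ in range(2))      # position projections

Qc, Kc, V = H @ Wq_c, H @ Wk_c, H @ Wv
Qr, Kr = P @ Wq_r, P @ Wk_r

# delta[i, j]: clamped relative distance of token i to token j, in [0, 2k)
idx = torch.arange(L)
delta = (idx[:, None] - idx[None, :]).clamp(-k, k - 1) + k

# The three disentangled terms (position-to-position is omitted in DeBERTa):
c2c = Qc @ Kc.T                          # content-to-content  [L, L]
c2p = (Qc @ Kr.T).gather(1, delta)       # content-to-position [L, L]
p2c = (Kc @ Qr.T).gather(1, delta).T     # position-to-content [L, L]

scores = (c2c + c2p + p2c) / math.sqrt(3 * d)   # the paper scales by sqrt(3d)
out = scores.softmax(dim=-1) @ V                # attention output [L, d]
print(out.shape)
```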

Model Capabilities

Text Classification
Natural Language Inference
Question Answering
Semantic Similarity Scoring

Use Cases

Natural Language Understanding
Textual Entailment
Determine the logical relationship between two sentences (entailment/contradiction/neutral).
Achieved 91.7/91.9 accuracy (matched/mismatched) on the MNLI task.
Sentiment Analysis
Analyze the sentiment of a text (see the zero-shot sketch after this section).
Achieved 97.2% accuracy on the SST-2 task.
Question Answering
Open-domain Question Answering
Answer questions based on text content.
Achieved F1/EM scores of 92.2/89.7 on SQuAD 2.0.
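
Because the checkpoint is fine-tuned for NLI, it can also drive the transformers zero-shot-classification pipeline, which scores each candidate label by treating it as a hypothesis against the input text. The repo id is the same assumption as above, and the example labels are illustrative.

```python
# Zero-shot classification via the NLI head; repo id is an assumption.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="NDugar/debertav3-mnli-snli-anli",  # assumed repo id
)
result = classifier(
    "The new phone's battery lasts two full days on a single charge.",
    candidate_labels=["positive", "negative", "neutral"],
)
# The pipeline returns labels sorted by score, highest first.
print(result["labels"][0], round(result["scores"][0], 3))
```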