
DeBERTa-XLarge-MNLI

Developed by Microsoft
DeBERTa-XLarge-MNLI is a 750M-parameter enhancement of BERT built on a disentangled attention mechanism and fine-tuned on the MNLI task, excelling at natural language understanding.
Downloads 833.58k
Release Time: 3/2/2022

Model Overview

DeBERTa improves upon BERT and RoBERTa through its disentangled attention mechanism and an enhanced mask decoder. Pre-trained on 80GB of text, it outperforms BERT and RoBERTa on most natural language understanding (NLU) tasks.

Model Features

Disentangled Attention Mechanism
Represents each token with separate content and relative-position vectors and computes attention scores from both, improving on the single-vector attention used by BERT and RoBERTa for natural language understanding tasks.
Enhanced Masked Decoder
Incorporates absolute position information in the decoding layer when predicting masked tokens during pre-training, further boosting the model's performance.
Large-scale Training Data
Pre-trained on 80GB of text, giving the model strong performance across a broad range of natural language understanding benchmarks.
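The disentangled attention idea above can be sketched in a few lines. This is a toy illustration, not the actual DeBERTa implementation: all dimensions, weight matrices, and variable names are made up for the example. Each token gets separate content and relative-position representations, and the attention score sums content-to-content, content-to-position, and position-to-content terms before a row-wise softmax.

```python
import numpy as np

rng = np.random.default_rng(0)
seq_len, d = 4, 8  # illustrative sequence length and hidden size

# Separate content vectors and relative-position embeddings (toy values):
# rel_pos[i, j] stands in for the embedding of position j relative to i.
content = rng.normal(size=(seq_len, d))
rel_pos = rng.normal(size=(seq_len, seq_len, d))

# Independent query/key projections for content and for position.
Wq_c, Wk_c = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wq_r, Wk_r = rng.normal(size=(d, d)), rng.normal(size=(d, d))

Qc, Kc = content @ Wq_c, content @ Wk_c

scores = np.zeros((seq_len, seq_len))
for i in range(seq_len):
    for j in range(seq_len):
        c2c = Qc[i] @ Kc[j]                   # content-to-content
        c2p = Qc[i] @ (rel_pos[i, j] @ Wk_r)  # content-to-position
        p2c = (rel_pos[j, i] @ Wq_r) @ Kc[j]  # position-to-content
        # Three score terms are summed, so the scale uses sqrt(3d).
        scores[i, j] = (c2c + c2p + p2c) / np.sqrt(3 * d)

# Row-wise softmax turns the scores into attention weights.
weights = np.exp(scores - scores.max(axis=1, keepdims=True))
weights /= weights.sum(axis=1, keepdims=True)
```

The point of the sketch is the three-term score: because content and position live in separate vectors, the model can weight "what a token says" and "where it sits relative to another token" independently.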

Model Capabilities

Natural Language Understanding
Text Classification
Semantic Similarity Calculation

Use Cases

Natural Language Processing
Textual Entailment Recognition
Identifies the logical relationship between two sentences (entailment, contradiction, or neutral).
Achieves 91.5/91.2 accuracy (matched/mismatched) on the MNLI task.
Semantic Similarity Calculation
Computes the semantic similarity between two sentences.
Achieves Pearson/Spearman correlation coefficients of 92.9/92.7 on the STS-B task.
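For the textual-entailment use case, the model's classification head emits one logit per MNLI class, and the predicted relationship is the argmax after a softmax. The sketch below shows only that final mapping step with made-up logits; the label order is an assumption, and with the real checkpoint you would read it from the model's own label configuration rather than hard-coding it.

```python
import math

# MNLI is three-way classification; this label order is an assumption
# for illustration, not read from the actual model config.
LABELS = ["CONTRADICTION", "NEUTRAL", "ENTAILMENT"]

def softmax(logits):
    # Numerically stable softmax over a list of floats.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

def classify(logits):
    # Map raw class logits to (label, probability) of the best class.
    probs = softmax(logits)
    best = max(range(len(probs)), key=probs.__getitem__)
    return LABELS[best], probs[best]

# Illustrative logits (not real model output) for a premise/hypothesis
# pair the model judges as entailment.
label, prob = classify([-2.1, 0.3, 3.5])
```

In practice these logits would come from running a premise/hypothesis pair through the fine-tuned model, e.g. via a sequence-classification pipeline in an inference framework.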