V2xl Again Mnli

Developed by NDugar
DeBERTa (Decoding-enhanced BERT with disentangled Attention) is an improved BERT-style model. By disentangling the attention mechanism and enhancing the mask decoder, it surpasses BERT and RoBERTa on multiple natural language understanding tasks.
Release Time : 3/2/2022

Model Overview

DeBERTa improves upon BERT and RoBERTa models through its disentangled attention mechanism and enhanced masked decoder. Trained on 80GB of data, it excels in most natural language understanding tasks.

Model Features

Disentangled Attention Mechanism
Replaces the single-vector attention of standard BERT: each token is represented by separate content and relative-position vectors, and attention weights are computed from disentangled matrices over both, improving model performance.
Enhanced Masked Decoder
Incorporates absolute position information in the decoding layer when predicting masked tokens, further improving performance on natural language understanding tasks.
Large-scale Training Data
Pre-trained on 80GB of data, it surpasses BERT and RoBERTa on multiple tasks.
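The disentangled attention described above can be illustrated with a toy numpy sketch. This is not the model's actual implementation; sizes, weight matrices, and the distance-clamping scheme are simplified assumptions, but the three score terms (content-to-content, content-to-position, position-to-content) and the 1/sqrt(3d) scaling follow the DeBERTa formulation.

```python
import numpy as np

rng = np.random.default_rng(0)
seq, d, span = 4, 8, 3  # toy sizes; `span` bounds relative distances

# Content states (per token) and relative-position embeddings (per distance).
Hc = rng.normal(size=(seq, d))
Pr = rng.normal(size=(2 * span, d))

# Separate projections for content and position queries/keys.
Wq, Wk = rng.normal(size=(d, d)), rng.normal(size=(d, d))
Wqr, Wkr = rng.normal(size=(d, d)), rng.normal(size=(d, d))

Qc, Kc = Hc @ Wq, Hc @ Wk    # content queries/keys
Qr, Kr = Pr @ Wqr, Pr @ Wkr  # position queries/keys

def rel_index(i, j):
    # Clamp the relative distance i - j into [0, 2*span - 1].
    return int(np.clip(i - j + span, 0, 2 * span - 1))

A = np.zeros((seq, seq))
for i in range(seq):
    for j in range(seq):
        c2c = Qc[i] @ Kc[j]                # content-to-content
        c2p = Qc[i] @ Kr[rel_index(i, j)]  # content-to-position
        p2c = Kc[j] @ Qr[rel_index(j, i)]  # position-to-content
        A[i, j] = (c2c + c2p + p2c) / np.sqrt(3 * d)

# Softmax over keys yields the attention weights.
weights = np.exp(A - A.max(axis=-1, keepdims=True))
weights /= weights.sum(axis=-1, keepdims=True)
```

The key design point is that relative-position information enters the score symmetrically (both as query and as key), rather than being added to the input embeddings once as in BERT.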

Model Capabilities

Natural Language Understanding
Text Classification
Zero-shot Classification
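Zero-shot classification with an NLI model like this one works by pairing the input text with one hypothesis per candidate label (e.g. "This text is about {label}.") and normalizing the entailment scores across labels. The logits below are hypothetical stand-ins for what the model would produce; only the scoring step is shown.

```python
import numpy as np

def zero_shot_scores(entailment_logits):
    """Softmax per-label entailment logits into a probability over labels."""
    z = np.asarray(entailment_logits, dtype=float)
    e = np.exp(z - z.max())
    return e / e.sum()

labels = ["sports", "politics", "technology"]
# Hypothetical entailment logits for "The match went to penalties."
logits = [3.1, -1.2, 0.4]
probs = zero_shot_scores(logits)
best = labels[int(np.argmax(probs))]  # -> "sports"
```

In practice, the Hugging Face `transformers` zero-shot-classification pipeline performs both steps (hypothesis construction and score normalization) given any MNLI-finetuned checkpoint.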

Use Cases

Natural Language Processing
Text Classification
Can be used for text classification tasks such as MNLI.
Achieves 91.3/91.1 matched/mismatched accuracy on MNLI.
Question Answering System
Can be used to build extractive question answering systems.
Achieves 95.5/90.1 F1/EM scores on SQuAD 1.1.
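Extractive QA systems of the kind scored above typically decode an answer by picking the span whose start and end logits sum highest, with the end no earlier than the start. A minimal sketch of that decoding step, with made-up toy logits:

```python
import numpy as np

def best_span(start_logits, end_logits, max_len=15):
    """Pick (start, end) maximizing start+end score, with end >= start
    and span length capped at max_len tokens."""
    s = np.asarray(start_logits, dtype=float)
    e = np.asarray(end_logits, dtype=float)
    best, best_score = (0, 0), -np.inf
    for i in range(len(s)):
        for j in range(i, min(i + max_len, len(e))):
            if s[i] + e[j] > best_score:
                best_score = s[i] + e[j]
                best = (i, j)
    return best

# Toy logits over 6 tokens: the answer span is tokens 2..3.
start = [0.1, 0.2, 4.0, 0.3, 0.1, 0.0]
end   = [0.0, 0.1, 0.5, 3.8, 0.2, 0.1]
span = best_span(start, end)  # -> (2, 3)
```

The length cap and the end >= start constraint are what keep the argmax from selecting degenerate spans; real pipelines add further filtering (e.g. excluding special tokens).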