
DeBERTa Base

Developed by kamalkraj
DeBERTa (Decoding-enhanced BERT with disentangled Attention) improves on the BERT and RoBERTa models with a disentangled attention mechanism and an enhanced mask decoder, and excels at natural language understanding tasks.
Downloads 287
Release Time: 3/2/2022

Model Overview

DeBERTa improves on BERT and RoBERTa through two techniques, a disentangled attention mechanism and an enhanced mask decoder, making it well suited to natural language understanding tasks.
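As a quick orientation, the sketch below encodes a sentence with the pretrained encoder via Hugging Face Transformers. It assumes the upstream checkpoint id microsoft/deberta-base; this listing's copy may be published under a different id.

import torch
from transformers import AutoTokenizer, AutoModel

# Checkpoint id assumed (upstream release); adjust to this listing's copy if needed.
tokenizer = AutoTokenizer.from_pretrained("microsoft/deberta-base")
model = AutoModel.from_pretrained("microsoft/deberta-base")

inputs = tokenizer("DeBERTa improves on BERT and RoBERTa.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One contextual vector per token: (batch, sequence_length, hidden_size=768)
print(outputs.last_hidden_state.shape)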

Model Features

Disentangled Attention Mechanism
Represents each token with two vectors, one for content and one for position, and computes attention weights from disentangled matrices over both, improving on the entangled self-attention of BERT and RoBERTa (see the sketch below).
Enhanced Mask Decoder
Incorporates absolute position information when predicting masked tokens during pre-training, strengthening the model's decoding of masked positions.
High Performance
Trained on 80GB of data, DeBERTa outperforms BERT and RoBERTa on the majority of natural language understanding tasks.
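To make the disentangled attention feature concrete, here is a simplified, illustrative sketch of the score computation for a single head. The three-term decomposition (content-to-content, content-to-position, position-to-content) and the scaling by the square root of 3d follow the DeBERTa paper; the per-pair relative-position tensor P and all variable names are simplifications for illustration, not the library's implementation.

import torch

def disentangled_scores(H, P, Wq, Wk, Wq_r, Wk_r):
    # H: (seq, d) content states; P: (seq, seq, d) relative-position
    # embeddings for each (query i, key j) pair -- a simplification.
    Qc, Kc = H @ Wq, H @ Wk                      # content projections
    Qr, Kr = P @ Wq_r, P @ Wk_r                  # position projections
    c2c = Qc @ Kc.T                              # content-to-content
    c2p = torch.einsum("id,ijd->ij", Qc, Kr)     # content-to-position
    p2c = torch.einsum("ijd,jd->ij", Qr, Kc)     # position-to-content
    d = H.shape[-1]
    return (c2c + c2p + p2c) / (3 * d) ** 0.5    # scale by sqrt(3d), per the paper

# Toy usage with random tensors:
seq, d = 4, 8
H = torch.randn(seq, d)
P = torch.randn(seq, seq, d)
W = [torch.randn(d, d) for _ in range(4)]
attn = torch.softmax(disentangled_scores(H, P, *W), dim=-1)  # (seq, seq)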

Model Capabilities

Natural Language Understanding
Text Classification
Question Answering Systems
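Each capability above is typically realized by attaching a task head to the encoder and fine-tuning. A minimal sketch for text classification, assuming the upstream checkpoint id and a three-way MNLI-style label set:

from transformers import AutoTokenizer, AutoModelForSequenceClassification

# A freshly initialized 3-way classification head (MNLI-style) is attached
# to the pretrained encoder; the head is meaningless until fine-tuned.
model_id = "microsoft/deberta-base"  # upstream checkpoint; adjust as needed
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=3)

enc = tokenizer("A soccer game with multiple males playing.",
                "Some men are playing a sport.", return_tensors="pt")
logits = model(**enc).logits  # (1, 3) scores over entailment/neutral/contradiction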

Use Cases

Natural Language Processing
Question Answering System
DeBERTa can be used to build high-performance question answering systems, for example on the SQuAD 1.1 and 2.0 benchmarks.
Achieves 93.1/87.2 (F1/EM) on SQuAD 1.1 and 86.2/83.1 (F1/EM) on SQuAD 2.0.
Text Classification
DeBERTa can be used for text classification tasks, such as natural language inference on MNLI.
Achieves 88.8 accuracy on MNLI-m.
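Note that the pretrained checkpoint ships without a question answering head, so reproducing the SQuAD numbers above requires fine-tuning first. Below is a sketch of extractive-QA inference, where your-org/deberta-base-squad is a hypothetical placeholder for such a fine-tuned checkpoint:

import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

# Placeholder id: a DeBERTa-base checkpoint fine-tuned on SQuAD is assumed.
model_id = "your-org/deberta-base-squad"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForQuestionAnswering.from_pretrained(model_id)

question = "Who developed DeBERTa?"
context = "DeBERTa was developed by Microsoft Research."
enc = tokenizer(question, context, return_tensors="pt")
with torch.no_grad():
    out = model(**enc)
start = out.start_logits.argmax()  # most likely answer start token
end = out.end_logits.argmax()      # most likely answer end token
print(tokenizer.decode(enc["input_ids"][0][start:end + 1]))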