
V3large 1epoch

Developed by NDugar
DeBERTa (Decoding-enhanced BERT with disentangled Attention) improves on BERT with a disentangled attention mechanism and an enhanced mask decoder, excelling at natural language understanding tasks.
Downloads 32
Release Time: 3/2/2022

Model Overview

DeBERTa improves on BERT and RoBERTa through its disentangled attention mechanism and enhanced mask decoder, and supports a wide range of natural language understanding tasks.
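If the checkpoint is published on the Hugging Face Hub, it can be loaded with the transformers library. The sketch below is illustrative only: the Hub id NDugar/V3large-1epoch is an assumption inferred from the developer and model names on this card, not confirmed by it.

```python
# Minimal loading sketch. The Hub id below is an assumption inferred
# from the developer and model names on this card.
from transformers import AutoTokenizer, AutoModel

model_id = "NDugar/V3large-1epoch"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("DeBERTa separates content and position attention.",
                   return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```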

Model Features

Disentangled Attention Mechanism
Represents each token with separate vectors for its content and its position and computes attention weights from both, improving how the model captures relationships between words (see the toy sketch after this list).
Enhanced Mask Decoder
Incorporates absolute position information when predicting masked tokens, improving the efficiency of masked language model pretraining.
Large-scale Pretraining
Trained on 160GB of raw text data.
Outstanding Performance
Surpasses BERT and RoBERTa on multiple GLUE benchmark tasks.
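As referenced above, here is a toy, single-head PyTorch sketch of the disentangled attention score from the DeBERTa paper: content-to-content, content-to-position, and position-to-content terms are computed from separate content and relative-position projections, summed, and scaled by 1/sqrt(3d). All names, shapes, and weights are illustrative, not taken from the released implementation.

```python
# Toy single-head sketch of DeBERTa's disentangled attention score:
#   A_ij = Qc_i . Kc_j  +  Qc_i . Kr_{d(i,j)}  +  Kc_j . Qr_{d(j,i)}
# scaled by 1/sqrt(3d). Sizes are illustrative.
import torch

torch.manual_seed(0)
n, d, k = 6, 16, 4                       # sequence length, head dim, max relative distance

H = torch.randn(n, d)                    # content hidden states
R = torch.randn(2 * k, d)                # relative-position embeddings

Wq_c, Wk_c = torch.randn(d, d), torch.randn(d, d)   # content projections
Wq_r, Wk_r = torch.randn(d, d), torch.randn(d, d)   # position projections

Qc, Kc = H @ Wq_c, H @ Wk_c
Qr, Kr = R @ Wq_r, R @ Wk_r

idx = torch.arange(n)
# relative index d(i, j) = clip(i - j, -k, k - 1) + k, which lands in [0, 2k)
delta = (idx[:, None] - idx[None, :]).clamp(-k, k - 1) + k

c2c = Qc @ Kc.T                                 # content-to-content
c2p = torch.gather(Qc @ Kr.T, 1, delta)         # content-to-position
p2c = torch.gather(Kc @ Qr.T, 1, delta).T       # position-to-content

attn = torch.softmax((c2c + c2p + p2c) / (3 * d) ** 0.5, dim=-1)
print(attn.shape)  # (n, n)
```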

Model Capabilities

Text Classification
Natural Language Inference
Question Answering Systems
Semantic Similarity Calculation

Use Cases

Text Analysis
Sentiment Analysis
Classify the sentiment polarity of a piece of text.
Achieves 97.2% accuracy on the SST-2 dataset.
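A pipeline sketch for this use case. The SST-2 score above suggests a sentiment-tuned variant exists, but this card names no checkpoint for it, so the model id below is a hypothetical placeholder.

```python
from transformers import pipeline

# Hypothetical placeholder id: substitute a DeBERTa checkpoint
# actually fine-tuned on SST-2 (or another sentiment dataset).
classify = pipeline("text-classification",
                    model="your-org/deberta-v3-large-sst2")
print(classify("The disentangled attention idea is elegant and effective."))
# e.g. [{'label': 'POSITIVE', 'score': 0.99}] -- labels depend on the checkpoint
```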
Natural Language Inference
Determine whether one text entails, contradicts, or is neutral toward another (see the sketch below).
Achieves 91.7%/91.9% accuracy on the MNLI matched/mismatched sets.
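An inference sketch for the NLI use case, assuming the checkpoint carries an MNLI-style sequence-classification head (suggested by the MNLI figures above). The Hub id is again an inferred assumption, and the label order should be read from the checkpoint's config rather than assumed.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "NDugar/V3large-1epoch"  # assumed Hub id, as above
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)

with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1).squeeze()

# Read the label mapping from the config instead of assuming an order.
for i, p in enumerate(probs.tolist()):
    print(model.config.id2label[i], round(p, 3))
```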
Question Answering Systems
Reading Comprehension
Answer questions based on a given passage.
Achieves 92.2/89.7 F1/EM scores on SQuAD 2.0.
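For the reading-comprehension use case, a pipeline sketch: the card reports SQuAD 2.0 scores but does not name a QA-headed checkpoint, so the model id below is a hypothetical placeholder for any DeBERTa checkpoint fine-tuned on SQuAD 2.0.

```python
from transformers import pipeline

# Hypothetical placeholder id: substitute a real DeBERTa checkpoint
# fine-tuned on SQuAD 2.0.
qa = pipeline("question-answering", model="your-org/deberta-v3-large-squad2")

answer = qa(
    question="What mechanism does DeBERTa use to relate content and position?",
    context="DeBERTa improves on BERT and RoBERTa with a disentangled "
            "attention mechanism and an enhanced mask decoder.",
)
print(answer["answer"], answer["score"])
```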