DistilBERT Base Uncased Fine-tuned on MNLI
This is a text classification model based on distilbert-base-uncased and fine-tuned on the MNLI task of the GLUE benchmark, used primarily for natural language inference.
Downloads: 23
Release date: 3/2/2022
Model Overview
This is a fine-tuned DistilBERT model designed for Natural Language Inference (NLI): given two sentences, it classifies their relationship as entailment, contradiction, or neutral.
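A minimal usage sketch is shown below. The checkpoint id, the example sentences, and the label names read from the model config are assumptions for illustration; this page does not state the exact Hub repository name.

```python
# Minimal NLI sketch (assumed checkpoint id; substitute the actual repository name).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "distilbert-base-uncased-finetuned-mnli"  # assumption, not confirmed by this page
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# Encode the premise/hypothesis pair; the tokenizer joins them with a [SEP] token.
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map the highest-scoring class index to its name via the config's id2label
# (expected to be entailment / neutral / contradiction for an MNLI model).
pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred])
```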
Model Features
Efficient Inference
Based on the DistilBERT architecture, which is 40% smaller than the standard BERT model while retaining 97% of its language-understanding performance.
High Accuracy
Achieves 82.06% accuracy on the MNLI evaluation set.
Lightweight
Fewer model parameters, making it suitable for deployment in resource-constrained environments (see the parameter-count sketch after this list).
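To make the size claims above concrete, the following sketch loads the public base checkpoints (distilbert-base-uncased and bert-base-uncased, not this fine-tuned model) and counts their parameters; DistilBERT base has roughly 66M parameters versus roughly 110M for BERT base.

```python
# Parameter-count sketch using the public base checkpoints (an illustration of
# the DistilBERT vs. BERT size gap, not a measurement of this fine-tuned model).
from transformers import AutoModel

for name in ["distilbert-base-uncased", "bert-base-uncased"]:
    model = AutoModel.from_pretrained(name)
    n_params = sum(p.numel() for p in model.parameters())
    print(f"{name}: {n_params / 1e6:.1f}M parameters")
```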
Model Capabilities
Text Classification
Natural Language Inference
Sentence-Pair Relationship Classification
Use Cases
Text Analysis
Text Entailment Judgment
Determine whether a premise sentence entails a hypothesis sentence (82.06% accuracy on the MNLI evaluation set).
Content Moderation
Identify contradictory statements in user-generated content (see the contradiction-flagging sketch after this list).
Intelligent Customer Service
Q&A System Validation
Verify whether the system's answer is consistent with the user's question.
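As a sketch of the content-moderation and answer-validation use cases, the snippet below batches sentence pairs and flags those the model labels as contradiction. The checkpoint id, the example pairs, and the assumption that the config's id2label holds human-readable MNLI labels are all illustrative.

```python
# Contradiction-flagging sketch (assumed checkpoint id and label names).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "distilbert-base-uncased-finetuned-mnli"  # assumption
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

pairs = [
    ("The product arrived on time.", "The delivery was two weeks late."),
    ("The battery lasts all day.", "The battery life is excellent."),
]

# Batch-encode all premise/hypothesis pairs with padding so they can be scored in one pass.
premises, hypotheses = zip(*pairs)
inputs = tokenizer(list(premises), list(hypotheses), padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    predictions = model(**inputs).logits.argmax(dim=-1)

for (premise, hypothesis), pred in zip(pairs, predictions):
    label = model.config.id2label[pred.item()]  # expected: entailment / neutral / contradiction
    if label.lower() == "contradiction":
        print(f"Possible contradiction: {premise!r} vs. {hypothesis!r}")
```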