BART Large MNLI Bewgle
BART-large-MNLI is a sequence classification model from Facebook, built on the BART architecture and fine-tuned for natural language inference tasks.
Downloads: 23
Release Time: 3/2/2022
Model Overview
This model is based on the BART architecture and fine-tuned on the Multi-Genre Natural Language Inference (MNLI) task. It determines the logical relationship (entailment, contradiction, or neutral) between a premise sentence and a hypothesis sentence.
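As a concrete illustration, the following minimal sketch scores a premise/hypothesis pair with the Hugging Face transformers library. The checkpoint name facebook/bart-large-mnli and the example sentences are assumptions for illustration; substitute this listing's own hub ID if it differs.

```python
# Minimal NLI sketch, assuming the facebook/bart-large-mnli checkpoint.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "facebook/bart-large-mnli"  # assumed hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "The contract requires payment within 30 days."
hypothesis = "Payment is due within a month."

# Encode the premise/hypothesis pair and score the three NLI labels.
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

probs = logits.softmax(dim=-1)[0]
for label_id, label in model.config.id2label.items():
    print(f"{label}: {probs[label_id].item():.3f}")
```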
Model Features
Multi-Genre Natural Language Inference
Optimized for the MNLI task; identifies entailment, contradiction, and neutral relationships between text pairs
Sequence-to-Sequence Architecture
Built on BART's encoder-decoder structure, combining a bidirectional encoder for comprehension with an autoregressive decoder for generation
Zero-shot Classification Capability
Supports zero-shot text classification by framing each candidate label as an NLI hypothesis through a template prompt (requires additional processing; see the sketch after this list)
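One way to exercise this capability is the transformers zero-shot-classification pipeline, which turns each candidate label into an NLI hypothesis and ranks labels by entailment score. The sketch below assumes the facebook/bart-large-mnli checkpoint name; the input text and candidate labels are illustrative only.

```python
# Zero-shot classification sketch, assuming the facebook/bart-large-mnli checkpoint.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

result = classifier(
    "The delivery arrived two weeks late and the box was damaged.",
    candidate_labels=["shipping", "billing", "product quality"],  # illustrative labels
)
print(result["labels"])  # labels sorted by score, highest first
print(result["scores"])
```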
Model Capabilities
Natural Language Inference
Zero-shot Classification
Textual Relationship Judgment
Use Cases
Text Analysis
Contract Clause Comparison
Automatically detect logical consistency between clauses in two contracts
Identifies approximately 90% of clause relationships correctly (figure based on MNLI test-set accuracy)
Content Moderation
User Feedback Classification
Classify the sentiment of user feedback through zero-shot classification (see the sketch below)
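A minimal sketch of this use case follows, again assuming the facebook/bart-large-mnli checkpoint name; the sentiment label set and hypothesis template are illustrative choices, not fixed by the model.

```python
# Sentiment-style zero-shot classification sketch for user feedback.
from transformers import pipeline

classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

feedback = "The app keeps crashing every time I try to upload a photo."
result = classifier(
    feedback,
    candidate_labels=["positive", "negative", "neutral"],  # illustrative labels
    hypothesis_template="The sentiment of this feedback is {}.",
)
print(result["labels"][0])  # top-ranked sentiment label
```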