BERT Medium MNLI
This is a PyTorch model converted from TensorFlow checkpoints in the official Google BERT repository and fine-tuned on the MNLI dataset for natural language inference.
Model Overview
This BERT variant originates from the paper 'Well-Read Students Learn Better: On the Importance of Pre-training Compact Models'. It is fine-tuned for natural language inference and performs strongly on the MNLI dataset.
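As a quick illustration, the sketch below loads the checkpoint with the transformers library and classifies a premise/hypothesis pair. The Hub repo ID prajjwal1/bert-medium-mnli and the example sentences are assumptions for illustration, not stated on this card; the id-to-label mapping is read from the model config rather than hardcoded, since it varies between MNLI fine-tunes.

```python
# Minimal NLI inference sketch. The repo ID below is an assumption.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "prajjwal1/bert-medium-mnli"  # hypothetical Hub repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# BERT NLI models take the premise and hypothesis as a sentence pair.
inputs = tokenizer(premise, hypothesis, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Read the label name from the config (entailment / neutral / contradiction).
pred = logits.argmax(dim=-1).item()
print(model.config.id2label[pred])
```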
Model Features
Efficient Pretraining
Converted from the official Google BERT checkpoints, this compact 'medium' variant (8 layers, hidden size 512) inherits BERT's pretraining while being cheaper to run than BERT-Base.
Specialized for NLI
Fine-tuned on the MNLI dataset to optimize its natural language inference capability.
Cross-Domain Generalization
Research shows the model performs well in cross-domain natural language inference tasks.
Model Capabilities
Natural Language Inference
Text Classification (see the zero-shot sketch after this list)
Semantic Understanding
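Because the model is an MNLI entailment classifier, it can also drive zero-shot text classification through the transformers pipeline, which frames each candidate label as a hypothesis to test against the input text. A minimal sketch, again assuming the hypothetical prajjwal1/bert-medium-mnli repo ID; note the pipeline expects an 'entailment' entry in the model's label mapping.

```python
# Zero-shot classification via the NLI head. Repo ID is an assumption.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="prajjwal1/bert-medium-mnli",  # hypothetical Hub repo ID
)

result = classifier(
    "The match ended in a dramatic penalty shootout.",
    candidate_labels=["sports", "politics", "technology"],
)
# Labels come back sorted by entailment score, best first.
print(result["labels"][0], result["scores"][0])
```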
Use Cases
Natural Language Processing
Textual Entailment Recognition
Determining the logical relationship between two sentences (entailment, contradiction, or neutral)
Achieves 75.86% accuracy on the MNLI (matched) validation set
Cross-Domain Inference
Performing inference tasks on texts from different domains
Achieves 77.03% accuracy on the MNLI mismatched (MNLI-mm) validation set
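To check numbers of this kind, the model can be scored against the MNLI validation splits from the Hugging Face datasets library. A hedged sketch, assuming the same hypothetical repo ID as above; it evaluates a small sample of the mismatched split, and predictions are compared by label name since id-to-label order differs between fine-tunes.

```python
# Small-sample accuracy check on MNLI mismatched. Repo ID is an assumption.
import torch
from datasets import load_dataset
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "prajjwal1/bert-medium-mnli"  # hypothetical Hub repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

ds = load_dataset("glue", "mnli", split="validation_mismatched")
label_names = ds.features["label"].names  # ["entailment", "neutral", "contradiction"]

n = 200  # small sample; drop select() below to score the full split
correct = 0
for ex in ds.select(range(n)):
    inputs = tokenizer(
        ex["premise"], ex["hypothesis"], truncation=True, return_tensors="pt"
    )
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(dim=-1).item()
    # Compare by label name to be robust to differing id orders.
    if model.config.id2label[pred].lower() == label_names[ex["label"]]:
        correct += 1

print(f"accuracy on {n}-example sample: {correct / n:.4f}")
```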