
BERT Small MNLI

Developed by prajjwal1
This model is a PyTorch model obtained by converting TensorFlow checkpoints from the official Google BERT repository. It originates from the paper 'Well-Read Students Learn Better: On the Importance of Pre-training Compact Models' and was subsequently fine-tuned on the MNLI dataset.
Downloads: 29
Release date: 3/2/2022

Model Overview

This compact BERT variant is intended for natural language inference tasks and has been fine-tuned on the MNLI dataset.
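A minimal inference sketch is shown below. It assumes the model is published on the Hugging Face Hub as prajjwal1/bert-small-mnli and that the head performs the usual three-way MNLI classification; the label order used here is an assumption, so check model.config.id2label for the authoritative mapping.

```python
# Minimal NLI inference sketch. Assumes the Hub id "prajjwal1/bert-small-mnli";
# verify the label order via model.config.id2label before relying on it.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "prajjwal1/bert-small-mnli"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name)
model.eval()

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# BERT-style NLI models score the premise and hypothesis as a sentence pair.
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

labels = ["entailment", "neutral", "contradiction"]  # assumed order
print(labels[logits.argmax(dim=-1).item()])
```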

Model Features

Trained on MNLI Dataset
The model was fine-tuned for 4 epochs on the MNLI dataset, optimizing it specifically for natural language inference.
Converted from TensorFlow
The model was converted from TensorFlow checkpoints in the official Google BERT repository, keeping its weights compatible with the original BERT architecture (see the conversion sketch after this list).
Backed by Published Research
The model originates from the paper 'Well-Read Students Learn Better: On the Importance of Pre-training Compact Models', which studies the pre-training of compact BERT models.
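For reference, a TensorFlow-to-PyTorch conversion of this kind can be done with the from_tf=True flag in transformers (the library also ships a dedicated convert_bert_original_tf_checkpoint_to_pytorch.py script). The sketch below assumes a local copy of the Google BERT-Small checkpoint; the directory and file names are illustrative, not the author's actual paths.

```python
# Sketch: load an original Google BERT TensorFlow checkpoint into PyTorch.
# Requires both torch and tensorflow to be installed; paths are hypothetical.
from transformers import BertConfig, BertForPreTraining

# BERT-Small from the compact-models release has L=4 layers and H=512 hidden size.
config = BertConfig.from_json_file("uncased_L-4_H-512_A-8/bert_config.json")

model = BertForPreTraining.from_pretrained(
    "uncased_L-4_H-512_A-8/bert_model.ckpt",  # TF checkpoint prefix (hypothetical path)
    from_tf=True,
    config=config,
)
model.save_pretrained("bert-small-pytorch")  # writes PyTorch weights and config
```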

Model Capabilities

Natural Language Inference
Text Classification

Use Cases

Natural Language Processing
Textual Entailment Recognition
Determines the logical relationship between two sentences: entailment, contradiction, or neutral.
Achieves 72.1% accuracy on the MNLI (matched) validation set and 73.76% on the MNLI-mm (mismatched) set.
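The figures above correspond to MNLI's matched and mismatched validation splits. A hedged evaluation sketch using the datasets library follows; it assumes the Hub id prajjwal1/bert-small-mnli and that the model's label indices line up with the dataset's (0 = entailment, 1 = neutral, 2 = contradiction in GLUE MNLI). Verify via model.config.id2label before trusting the numbers.

```python
# Rough accuracy check on the MNLI matched validation split.
# Use split="validation_mismatched" for the MNLI-mm figure.
import torch
from datasets import load_dataset
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_name = "prajjwal1/bert-small-mnli"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForSequenceClassification.from_pretrained(model_name).eval()

ds = load_dataset("glue", "mnli", split="validation_matched")

correct = 0
for ex in ds:
    inputs = tokenizer(ex["premise"], ex["hypothesis"],
                       return_tensors="pt", truncation=True)
    with torch.no_grad():
        pred = model(**inputs).logits.argmax(dim=-1).item()
    # Assumes model label ids match the dataset's; check config.id2label.
    correct += int(pred == ex["label"])

print(f"accuracy: {correct / len(ds):.4f}")
```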