
DistilBart-MNLI-12-3

Developed by valhalla
DistilBart-MNLI is a distilled version of bart-large-mnli built with Hugging Face's No Teacher Distillation technique, achieving performance close to the original model while being considerably more lightweight.
Downloads 8,791
Release Time : 3/2/2022

Model Overview

This model is a natural language inference (NLI) model based on the BART architecture. It is a lightweight version of bart-large-mnli, designed primarily for zero-shot classification tasks.
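Because the model is trained on MNLI, it works as a zero-shot classifier out of the box through the Hugging Face transformers pipeline. A minimal sketch; the input text and candidate labels are illustrative:

```python
from transformers import pipeline

# Load the distilled NLI model as a zero-shot classifier
classifier = pipeline(
    "zero-shot-classification",
    model="valhalla/distilbart-mnli-12-3",
)

result = classifier(
    "one day I will see the world",
    candidate_labels=["travel", "cooking", "dancing"],
)
# Labels are returned sorted by score, highest first
print(result["labels"][0], result["scores"][0])
```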

Model Features

No Teacher Distillation technique
Employs Hugging Face's No Teacher Distillation method: alternating layers are copied from the teacher model and the smaller student is then fine-tuned on the same data, compressing the model without a separate distillation loss (see the sketch after this list).
Performance close to the original model
Performs strongly on the MNLI dataset: the 12-6 variant reaches 89.19% matched accuracy, close to the original model's 89.9%.
Multiple configurations available
Ships in several encoder-decoder layer configurations (12-1, 12-3, 12-6, 12-9) to trade off speed against accuracy.
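The layer-copying idea behind these configurations can be sketched as follows. The decoder layer indices below are an illustrative assumption, not the exact recipe used to build this checkpoint:

```python
import torch
from transformers import BartForSequenceClassification

# Start from the full teacher (12 encoder / 12 decoder layers)
student = BartForSequenceClassification.from_pretrained("facebook/bart-large-mnli")

# Keep all 12 encoder layers; copy 3 alternating decoder layers
# (the indices chosen here are illustrative)
keep = [0, 6, 11]
student.model.decoder.layers = torch.nn.ModuleList(
    [student.model.decoder.layers[i] for i in keep]
)
student.config.decoder_layers = len(keep)

# The truncated student is then fine-tuned again on MNLI
```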

Model Capabilities

Natural language inference
Zero-shot classification
Textual entailment recognition
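For direct NLI use (textual entailment), the model scores a premise-hypothesis pair. A minimal sketch; the example sentences are illustrative, and the label names are read from the model's own config rather than assumed:

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "valhalla/distilbart-mnli-12-3"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

# Encode the pair as premise/hypothesis and score it
inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Map probabilities back to the model's labels
# (for MNLI models, typically contradiction/neutral/entailment)
probs = logits.softmax(dim=-1)[0]
for i, p in enumerate(probs):
    print(model.config.id2label[i], round(p.item(), 3))
```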

Use Cases

Text classification
Sentiment analysis
Classifies the sentiment of text without any task-specific fine-tuning
Content moderation
Harmful content detection
Identifies inappropriate or harmful content in text, as shown in the sketch below
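For the moderation use case, candidate labels can describe policy categories, and multi_label=True lets several labels apply to the same text independently. The labels and example text below are illustrative assumptions:

```python
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="valhalla/distilbart-mnli-12-3",
)

text = "You are worthless and everyone hates you."
labels = ["harassment", "hate speech", "spam", "benign"]

# multi_label=True scores each label on its own, so scores need not sum to 1
result = classifier(text, candidate_labels=labels, multi_label=True)
for label, score in zip(result["labels"], result["scores"]):
    print(f"{label}: {score:.2f}")
```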