
DistilBERT Base Uncased MNLI

Developed by Xenova
A lightweight DistilBERT model fine-tuned for Multi-Genre Natural Language Inference (MNLI), based on the BERT architecture but with fewer parameters and higher efficiency.
Downloads 1,215
Release Time: 5/2/2023

Model Overview

A lightweight natural language processing model based on the DistilBERT architecture, fine-tuned on the Multi-Genre Natural Language Inference (MNLI) dataset and suitable for zero-shot classification scenarios.

Model Features

Lightweight and efficient
Has roughly 40% fewer parameters than the original BERT model while retaining about 97% of its performance.
Zero-shot classification
Can be applied directly to text classification tasks without any task-specific fine-tuning (see the usage sketch below).
ONNX compatible
Provides ONNX-format weights for efficient web-based deployment, e.g., in-browser inference with Transformers.js.
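
A minimal usage sketch, assuming the Transformers.js package (@xenova/transformers) is installed and the model id is Xenova/distilbert-base-uncased-mnli; the example text and labels are illustrative:

```js
// Minimal zero-shot classification sketch with Transformers.js
// (assumed setup: `npm install @xenova/transformers`).
import { pipeline } from '@xenova/transformers';

// Load the zero-shot classification pipeline; the ONNX weights are
// downloaded and cached on first use.
const classifier = await pipeline(
  'zero-shot-classification',
  'Xenova/distilbert-base-uncased-mnli'
);

// Classify a sentence against candidate labels chosen at inference time,
// with no fine-tuning on these labels.
const result = await classifier(
  'The battery life on this laptop is fantastic.',
  ['positive', 'negative']
);

console.log(result);
// Expected shape (scores are illustrative):
// { sequence: '...', labels: ['positive', 'negative'], scores: [0.98, 0.02] }
```

Because the pipeline runs on ONNX Runtime, the same code works in Node.js or directly in the browser.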

Model Capabilities

Text classification
Natural language inference
Zero-shot learning
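
The zero-shot capability is built on natural language inference: each candidate label is inserted into a hypothesis template, and the model scores whether the input text entails that hypothesis. A sketch of this, assuming the same Transformers.js setup; the hypothesis_template option name follows the zero-shot classification pipeline and should be treated as an assumption:

```js
// Sketch of how zero-shot classification builds on NLI: each candidate label is
// slotted into a hypothesis template and scored for entailment against the input.
import { pipeline } from '@xenova/transformers';

const nliClassifier = await pipeline(
  'zero-shot-classification',
  'Xenova/distilbert-base-uncased-mnli'
);

const topics = await nliClassifier(
  'The central bank raised interest rates by 25 basis points.',
  ['economics', 'sports', 'weather'],
  // '{}' is replaced by each label, producing hypotheses such as
  // "This text is about economics."
  { hypothesis_template: 'This text is about {}.' }
);

console.log(topics.labels[0]); // expected top label: 'economics'
```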

Use Cases

Text analysis
Sentiment analysis
Classifies text as positive/negative sentiment without training
Topic classification
Automatically categorizes documents based on custom labels
Customer support
Ticket classification
Automatically routes customer inquiries into predefined categories (see the sketch below).
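
A sketch of the ticket-classification use case, again assuming the Transformers.js setup above; the category labels are illustrative and the multi_label option name is taken from the zero-shot classification pipeline:

```js
// Sketch: routing customer inquiries to predefined categories.
// multi_label lets several categories score independently instead of
// competing in a single softmax.
import { pipeline } from '@xenova/transformers';

const ticketClassifier = await pipeline(
  'zero-shot-classification',
  'Xenova/distilbert-base-uncased-mnli'
);

const ticket = 'I was charged twice this month and the app also keeps crashing on login.';
const categories = ['billing', 'technical issue', 'account access', 'feature request'];

const routed = await ticketClassifier(ticket, categories, { multi_label: true });

// With multi_label, each category gets its own entailment-based probability,
// so overlapping issues (billing + technical) can both score highly.
console.log(routed.labels);
console.log(routed.scores);
```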