
DistilBERT Base Uncased MNLI

Developed by optimum
A case-insensitive DistilBERT model fine-tuned on the Multi-Genre Natural Language Inference (MNLI) dataset, optimized for zero-shot classification tasks
Downloads 53
Release Time: 3/24/2022

Model Overview

This lightweight natural language understanding model is designed for text classification and natural language inference, and is particularly well suited to zero-shot classification scenarios.

Model Features

Lightweight architecture
Built on DistilBERT, which is roughly 40% smaller and 60% faster than BERT-base while retaining most of its language-understanding performance
Zero-shot classification capability
Classifies text against arbitrary candidate labels without task-specific training, by framing each label as an NLI hypothesis (see the example after this list)
Case-insensitive
The uncased tokenizer lowercases all input, so predictions do not depend on capitalization
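
A minimal sketch of zero-shot classification with this model via the Hugging Face transformers pipeline. The Hub id "optimum/distilbert-base-uncased-mnli" is an assumption based on the developer name shown on this page; substitute the exact id from the model page if it differs.

```python
# Zero-shot classification sketch; the model id below is assumed, not confirmed.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="optimum/distilbert-base-uncased-mnli",  # assumed Hub id
)

result = classifier(
    "The new phone has an excellent camera but the battery drains quickly.",
    candidate_labels=["electronics", "sports", "politics"],
)

# The pipeline returns labels sorted by score; print the top prediction.
print(result["labels"][0], result["scores"][0])
```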

Model Capabilities

Text classification
Natural language inference
Zero-shot learning
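
For the natural language inference capability, a sketch of calling the model directly on a premise-hypothesis pair is shown below, again assuming the same (unconfirmed) Hub id; the label names come from the model's own config.id2label mapping.

```python
# Direct NLI scoring sketch: entailment / neutral / contradiction probabilities.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "optimum/distilbert-base-uncased-mnli"  # assumed Hub id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

premise = "A soccer game with multiple males playing."
hypothesis = "Some men are playing a sport."

inputs = tokenizer(premise, hypothesis, return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits

# Convert logits to probabilities and report each NLI label.
probs = logits.softmax(dim=-1)[0]
for idx, p in enumerate(probs):
    print(model.config.id2label[idx], round(p.item(), 3))
```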

Use Cases

Text analysis
Sentiment analysis
Determine whether a text expresses positive or negative sentiment
Topic classification
Classify text into predefined topic categories
Content moderation
Inappropriate content detection
Identify inappropriate or sensitive content in text; a combined example covering these use cases follows below
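
The sketch below illustrates the use cases above with the zero-shot pipeline. The candidate labels and input texts are arbitrary examples, and the model id remains an assumption as noted earlier.

```python
# Illustrative sketch: sentiment analysis, topic classification, and content
# moderation framed as zero-shot classification with different label sets.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="optimum/distilbert-base-uncased-mnli",  # assumed Hub id
)

texts = [
    "I absolutely loved this movie, the acting was superb.",    # sentiment
    "The central bank raised interest rates again this week.",  # topic
    "You are worthless and everyone hates you.",                # moderation
]
label_sets = [
    ["positive", "negative"],
    ["finance", "sports", "entertainment"],
    ["harassment", "safe content"],
]

for text, labels in zip(texts, label_sets):
    out = classifier(text, candidate_labels=labels)
    print(out["labels"][0], "<-", text)
```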