
DistilBERT Base Uncased MNLI

Developed by typeform
DistilBERT is a distilled version of BERT that retains 97% of BERT's performance while being 40% smaller and 60% faster.
Downloads: 74.81k
Release Time: 3/2/2022

Model Overview

DistilBERT is a lightweight model derived from BERT via knowledge distillation. This checkpoint is fine-tuned on the Multi-Genre Natural Language Inference (MNLI) dataset, which makes it well suited to natural language inference and, by extension, zero-shot text classification.

Model Features

Lightweight and Efficient
40% smaller in size and 60% faster in inference compared to the original BERT model
High Performance
Retains 97% of BERT's language understanding performance
Multi-task Support
Suitable for various natural language processing tasks

Model Capabilities

Text classification
Zero-shot classification (see the sketch below)
Natural language understanding
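
A minimal sketch of the zero-shot classification capability using the Hugging Face transformers pipeline, assuming the checkpoint is published on the Hub as typeform/distilbert-base-uncased-mnli; the example sentence and candidate labels are illustrative.

```python
from transformers import pipeline

# Zero-shot classification via the NLI-finetuned DistilBERT checkpoint.
classifier = pipeline(
    "zero-shot-classification",
    model="typeform/distilbert-base-uncased-mnli",
)

# Classify a sentence against arbitrary labels supplied at inference time.
result = classifier(
    "The battery lasts all day and the screen is gorgeous.",  # illustrative input
    candidate_labels=["positive", "negative", "neutral"],
)

# The pipeline returns labels sorted by score; print the top prediction.
print(result["labels"][0], result["scores"][0])
```

Because the labels are passed at inference time, the same model can serve new classification schemes without retraining.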

Use Cases

Text Analysis
Sentiment Analysis: analyze the sentiment tendency of a text, yielding high-accuracy sentiment classification
Topic Classification: classify text into predefined categories
Customer Service
Intent Recognition: identify the intent behind user queries (see the sketch after this list)
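
A sketch of the intent-recognition use case, built on the same zero-shot pipeline; the customer query and the candidate intent labels are hypothetical and would be replaced by an application's own intent set.

```python
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="typeform/distilbert-base-uncased-mnli",
)

# Hypothetical customer-service query and intent labels.
query = "I was charged twice for my last order."
intents = ["refund request", "order status", "technical support", "account question"]

result = classifier(query, candidate_labels=intents)

# Inspect the full score distribution over candidate intents.
for label, score in zip(result["labels"], result["scores"]):
    print(f"{label}: {score:.2f}")
```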