
DistilBERT Sentiment

Developed by appleboiy
DistilBERT is a lightweight distilled version of BERT, retaining 97% of BERT's performance while being 40% smaller in size.
Downloads 17
Release Time: 12/5/2024

Model Overview

DistilBERT is a lightweight model based on BERT, trained with knowledge distillation, and suited to natural language processing tasks such as text classification and sentiment analysis.

Model Features

Lightweight
40% smaller in size than the original BERT, with faster inference speed.
High Performance
Retains 97% of BERT's performance, suitable for various NLP tasks.
Knowledge Distillation
Trained with BERT as the teacher model via knowledge distillation, reducing training cost.

Model Capabilities

Text classification
Sentiment analysis
Zero-shot classification
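The capabilities above map directly onto the Hugging Face `transformers` pipeline API. This card does not name its exact checkpoint, so the sketch below assumes the widely used SST-2 fine-tune of DistilBERT (`distilbert-base-uncased-finetuned-sst-2-english`) as a stand-in for binary sentiment classification:

```python
# Minimal sketch: sentiment analysis with a DistilBERT checkpoint.
# NOTE: the model id below is an assumed stand-in; this card's own
# checkpoint id is not given.
from transformers import pipeline

classifier = pipeline(
    "sentiment-analysis",
    model="distilbert-base-uncased-finetuned-sst-2-english",
)

reviews = [
    "A moving, beautifully shot film with a superb lead performance.",
    "Two hours of my life I will never get back.",
]
for review, result in zip(reviews, classifier(reviews)):
    # Each result is a dict like {"label": "POSITIVE", "score": 0.99}
    print(result["label"], round(result["score"], 3), "-", review)
```

The same `pipeline` call can be pointed at any compatible DistilBERT sequence-classification checkpoint by changing the `model` argument.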

Use Cases

Sentiment analysis
Movie review sentiment analysis
Analyze the sentiment polarity of movie reviews (positive or negative).
High-accuracy binary classification results.
Text classification
News classification
Classify news articles into predefined categories.
Supports multi-classification tasks.
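The news-classification use case can also be handled without task-specific training via zero-shot classification. The card does not specify a checkpoint for this, so the sketch below assumes a DistilBERT MNLI fine-tune (`typeform/distilbert-base-uncased-mnli`) as a stand-in:

```python
# Minimal sketch: zero-shot news classification with a DistilBERT
# NLI checkpoint. NOTE: the model id is an assumed stand-in; the
# candidate labels are illustrative, not from this card.
from transformers import pipeline

classifier = pipeline(
    "zero-shot-classification",
    model="typeform/distilbert-base-uncased-mnli",
)

article = "The central bank raised interest rates by a quarter point on Tuesday."
labels = ["business", "sports", "technology", "politics"]

result = classifier(article, candidate_labels=labels)
# result["labels"] is sorted by score, highest first
print(result["labels"][0], round(result["scores"][0], 3))
```

Because the labels are supplied at inference time, the same pipeline supports any predefined category set without retraining.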