distilbert-base-uncased-finetuned-emotion
A lightweight emotion classification model based on DistilBERT, fine-tuned on an emotion dataset and reaching 92.95% accuracy.
Downloads: 20
Release date: 3/2/2022
Model Overview
This model is a fine-tuned version of DistilBERT, designed for emotion classification of text. Through knowledge distillation it retains most of BERT's performance while reducing model size and computational requirements.
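As a sketch of how such a checkpoint is typically used, the example below loads it through the Transformers `pipeline` API. The model id is assumed from this card's title and should be verified on the hub; the `softmax` helper only illustrates how the pipeline turns raw logits into the probabilities it reports.

```python
# Minimal sketch, assuming the checkpoint id matches this card's title.
import math


def softmax(logits):
    """Convert raw classifier logits to probabilities, as the pipeline does."""
    m = max(logits)  # subtract the max for numerical stability
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]


def classify(texts):
    """Load the fine-tuned checkpoint and classify texts.

    Requires `pip install transformers` and network access to download
    the model weights on first use.
    """
    from transformers import pipeline

    classifier = pipeline(
        "text-classification",
        model="distilbert-base-uncased-finetuned-emotion",  # assumed model id
    )
    return classifier(texts)
```

Calling `classify(["I love this new phone!"])` returns a list of label/score dicts, where the score is the softmax probability of the predicted emotion.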
Model Features
Efficient and Lightweight
Built with knowledge distillation, the model is 40% smaller than standard BERT and 60% faster at inference while retaining about 95% of its performance.
High Accuracy
Achieves 92.95% accuracy and a 93.0% F1 score on the emotion classification task.
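For reference, these metrics can be reproduced from a set of predictions as in the pure-Python sketch below. The toy labels are illustrative, not the actual evaluation data, and the card does not state which F1 averaging was used; macro averaging is shown here as one common choice.

```python
def accuracy(y_true, y_pred):
    """Fraction of predictions that match the true labels."""
    return sum(t == p for t, p in zip(y_true, y_pred)) / len(y_true)


def macro_f1(y_true, y_pred):
    """Unweighted mean of the per-class F1 scores (macro averaging)."""
    labels = set(y_true) | set(y_pred)
    f1s = []
    for label in labels:
        tp = sum(t == label and p == label for t, p in zip(y_true, y_pred))
        fp = sum(t != label and p == label for t, p in zip(y_true, y_pred))
        fn = sum(t == label and p != label for t, p in zip(y_true, y_pred))
        precision = tp / (tp + fp) if tp + fp else 0.0
        recall = tp / (tp + fn) if tp + fn else 0.0
        f1s.append(
            2 * precision * recall / (precision + recall)
            if precision + recall
            else 0.0
        )
    return sum(f1s) / len(f1s)
```

In practice libraries such as scikit-learn or `evaluate` compute these, but the arithmetic is exactly what is sketched above.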
Fast Fine-Tuning
Requires only 2 training epochs to achieve excellent performance, with training loss decreasing from 0.2853 to 0.1568.
Model Capabilities
Text Sentiment Analysis
Short Text Classification
Sentiment Polarity Detection
Use Cases
Social Media Analysis
User Comment Sentiment Analysis
Analyze the sentiment tendencies of social media comments
Identifies the sentiment of comments with approximately 92.95% accuracy
Customer Service
Customer Feedback Classification
Automatically classify the sentiment tendencies of customer feedback
Helps quickly identify negative feedback
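A minimal sketch of how the classifier's output could be used to surface negative feedback. The label names and score threshold are assumptions for illustration, not part of this model card; adjust them to whatever label set the deployed checkpoint actually emits.

```python
# Assumed negative-leaning labels from a typical 6-class emotion scheme.
NEGATIVE_LABELS = {"sadness", "anger", "fear"}


def flag_negative(predictions, threshold=0.5):
    """Return feedback items whose top predicted emotion is negative.

    `predictions` is a list of (text, label, score) tuples, e.g. built
    from the output of a text-classification pipeline.
    """
    return [
        (text, label, score)
        for text, label, score in predictions
        if label in NEGATIVE_LABELS and score >= threshold
    ]
```

Feeding the pipeline's predictions through `flag_negative` yields only the confidently negative items, which can then be routed to a support queue for review.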