
BERT Base Uncased SST-2 Distilled

Developed by doyoungkim
This model is a fine-tuned version of bert-base-uncased on the SST-2 dataset, used primarily for text classification tasks.
Downloads 106
Release Time: 3/2/2022

Model Overview

This is a distilled BERT model based on the bert-base-uncased architecture, fine-tuned on the SST-2 (Stanford Sentiment Treebank) dataset for sentiment analysis tasks.
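As a quick check of the checkpoint, a minimal inference sketch using the Hugging Face transformers pipeline is shown below. The repository id is assumed from the model name and should be verified on the Hub; the exact label names depend on the checkpoint's config.

```python
from transformers import pipeline

# Repository id assumed from the model name; verify on the Hugging Face Hub.
classifier = pipeline(
    "text-classification",
    model="doyoungkim/bert-base-uncased-sst2-distilled",
)

print(classifier("A surprisingly heartfelt and well-acted film."))
# e.g. [{'label': 'positive', 'score': 0.99}]  (label names depend on the checkpoint config)
```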

Model Features

Distilled Model
The model learns from a larger teacher model through knowledge distillation, reducing model size while maintaining performance (a minimal sketch of this objective follows the feature list).
High Accuracy
Achieves 90.25% accuracy on the SST-2 evaluation set.
Efficient Fine-tuning
Fine-tuned from the pre-trained bert-base-uncased checkpoint, which keeps training cost low.
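For context, a minimal sketch of a typical knowledge-distillation objective is shown below. The temperature and weighting values are illustrative assumptions, not the settings used to train this model.

```python
import torch
import torch.nn.functional as F

def distillation_loss(student_logits, teacher_logits, labels,
                      temperature=2.0, alpha=0.5):
    """Blend a soft-target KL loss with the usual cross-entropy loss.

    temperature and alpha are illustrative hyperparameters, not values
    reported for this checkpoint.
    """
    # Soften both distributions before comparing them
    soft_teacher = F.softmax(teacher_logits / temperature, dim=-1)
    soft_student = F.log_softmax(student_logits / temperature, dim=-1)
    kd = F.kl_div(soft_student, soft_teacher, reduction="batchmean") * temperature ** 2
    # Standard supervised loss on the hard labels
    ce = F.cross_entropy(student_logits, labels)
    return alpha * kd + (1 - alpha) * ce
```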

Model Capabilities

Text Classification
Sentiment Analysis
Natural Language Understanding

Use Cases

Sentiment Analysis
Product Review Sentiment Classification
Analyze whether user reviews of a product are positive or negative (see the sketch after this list).
Accuracy reaches 90.25%
Social Media Sentiment Monitoring
Monitor the emotional tendencies of users on social media regarding specific topics.
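A minimal sketch of batch review classification, loading the checkpoint directly through transformers. The repository id is assumed as above, and the review texts are made up for illustration.

```python
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

# Repository id assumed from the model name; adjust to the actual Hub path.
model_id = "doyoungkim/bert-base-uncased-sst2-distilled"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)
model.eval()

reviews = [
    "Battery life is excellent and setup took five minutes.",
    "Stopped working after two days, would not recommend.",
]

inputs = tokenizer(reviews, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    probs = model(**inputs).logits.softmax(dim=-1)

for review, p in zip(reviews, probs):
    label = model.config.id2label[int(p.argmax())]
    print(f"{label:>10}  {p.max():.2f}  {review}")
```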