Distilbert Base Uncased Subj Train 8 0
A text classification model fine-tuned from distilbert-base-uncased, achieving 78.9% accuracy on its evaluation set
Downloads: 26
Release date: 3/2/2022
Model Overview
This model is a fine-tuned version of the DistilBERT base (uncased) model, intended for text classification tasks; the card does not state a specific application scenario.
Model Features
Efficient and Lightweight
Built on the DistilBERT architecture, which is substantially smaller and faster than the full BERT base model while retaining most of its language-understanding performance
Fast Training
Uses mixed-precision training (native AMP) to accelerate the training process
Linear Learning Rate Scheduling
Uses a linear learning-rate schedule during fine-tuning
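The card does not give scheduler details (warmup steps, total steps), so the following is only a generic sketch of the kind of linear learning-rate schedule typically used for DistilBERT fine-tuning (e.g. via `transformers.get_linear_schedule_with_warmup`): an optional linear warmup followed by a linear decay to zero.

```python
def linear_lr(step, total_steps, base_lr, warmup_steps=0):
    """Generic linear LR schedule: optional warmup, then linear decay to 0.

    Assumptions (not stated in the model card): warmup_steps and
    total_steps are illustrative hyperparameters.
    """
    if warmup_steps and step < warmup_steps:
        # Linear warmup from 0 up to base_lr.
        return base_lr * step / warmup_steps
    # Linear decay from base_lr at the end of warmup down to 0.
    remaining = max(0, total_steps - step)
    return base_lr * remaining / max(1, total_steps - warmup_steps)
```

For example, with `base_lr=2e-5` and `total_steps=100`, the rate is `2e-5` at step 0, `1e-5` at step 50, and `0.0` at step 100.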
Model Capabilities
Text Classification
Natural Language Understanding
Use Cases
Text Analysis
Sentiment Analysis
Can be used to classify the sentiment polarity of text
Topic Classification
Can be used to classify text into predefined topic categories
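For any of these classification use cases, the model's raw logits must be turned into a label and a confidence score. The label names below are assumptions for illustration (the "subj" in the model name suggests a subjectivity task, but the card does not state the label set); the softmax post-processing itself is standard.

```python
import math

def classify(logits, labels=("objective", "subjective")):
    """Map raw classifier logits to (label, probability) via softmax.

    `labels` is a hypothetical label set -- the model card does not
    document the actual class names.
    """
    # Subtract the max logit for numerical stability before exponentiating.
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(probs)), key=probs.__getitem__)
    return labels[best], probs[best]
```

In practice the logits would come from running the fine-tuned checkpoint through a sequence-classification head (e.g. with the `transformers` library); this sketch only shows the decision step.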
© 2025 AIbase