DistilBERT Base Uncased Fine-tuned
A fine-tuned version of the DistilBERT base model, suited to text classification tasks
Downloads 15
Release Time: 4/29/2022
Model Overview
This model is a fine-tuned version of distilbert-base-uncased, intended mainly for text classification. Although the training dataset is not specified, the reported evaluation results show an accuracy of 97.15% on the target task.
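As a reference for applying a checkpoint like this, the sketch below loads it with the Hugging Face `transformers` pipeline API. The repo id `your-org/distilbert-base-uncased-finetuned` is a placeholder (the card does not state the actual model id), and the softmax helper shows how raw classifier logits map to class probabilities:

```python
import math

def softmax(logits):
    """Convert raw classifier logits to probabilities (numerically stable)."""
    m = max(logits)
    exps = [math.exp(x - m) for x in logits]
    total = sum(exps)
    return [e / total for e in exps]

if __name__ == "__main__":
    # Requires `pip install transformers torch`; downloads the checkpoint.
    from transformers import pipeline

    # Placeholder model id -- substitute the actual fine-tuned checkpoint.
    clf = pipeline(
        "text-classification",
        model="your-org/distilbert-base-uncased-finetuned",
    )
    print(clf("This product exceeded my expectations."))
```

The pipeline applies the same tokenizer the model was trained with and returns the top label with its score, so no manual preprocessing is needed.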
Model Features
Efficient and lightweight
Based on the DistilBERT architecture, it is 40% smaller than the standard BERT model while retaining 97% of its language-understanding capability
High accuracy
Achieves an accuracy of 97.15% on the evaluation set
Fast training
The distilled architecture makes both training and inference faster
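The 97.15% figure above is a plain classification accuracy. As a minimal reference, the metric reduces to the fraction of predictions that match the gold labels:

```python
def accuracy(predictions, labels):
    """Fraction of predictions that match the gold labels."""
    if len(predictions) != len(labels):
        raise ValueError("predictions and labels must be the same length")
    correct = sum(p == gold for p, gold in zip(predictions, labels))
    return correct / len(labels)

# Example: 3 of 4 predictions correct.
print(accuracy([1, 0, 1, 1], [1, 0, 0, 1]))  # -> 0.75
```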
Model Capabilities
Text classification
Natural language understanding
Sentiment analysis (inferred)
Intent recognition (inferred)
Use Cases
Text analysis
Sentiment analysis
Analyze the sentiment tendency in the text
Accuracy: 97.15% (inferred from the evaluation data)
Content classification
Classify text content into predefined categories
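For both use cases above, classification comes down to taking the highest-scoring class and mapping its index to a label. A small sketch, with a hypothetical label set (the card does not list the model's actual classes):

```python
def classify(logits, id2label):
    """Map raw logits to the highest-scoring category label (argmax)."""
    best = max(range(len(logits)), key=lambda i: logits[i])
    return id2label[best]

# Hypothetical label set -- the actual classes are not stated in the card.
id2label = {0: "negative", 1: "positive"}
print(classify([-1.2, 3.4], id2label))  # -> "positive"
```

In `transformers` checkpoints this index-to-label mapping is stored in the model config, so the pipeline performs this step automatically.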