DistilBERT Base Uncased AG News
A lightweight text classification model based on the DistilBERT architecture, fine-tuned on the AG News dataset with the TextAttack framework and suited to news classification tasks.
Downloads: 544
Release Time: 3/2/2022
Model Overview
This model is a fine-tuned version of DistilBERT for sequence classification, and it performs particularly well on news classification tasks.
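The sketch below shows one way to run inference with this checkpoint through the Hugging Face transformers library. The Hub id textattack/distilbert-base-uncased-ag-news and the example headline are assumptions, not taken from this card; adjust them to wherever the checkpoint is actually hosted.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

# Assumed Hub id for this checkpoint; change if the model is hosted elsewhere.
MODEL_ID = "textattack/distilbert-base-uncased-ag-news"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(MODEL_ID)
model.eval()

text = "Stocks rallied after the central bank signaled a pause in rate hikes."
inputs = tokenizer(text, return_tensors="pt", truncation=True, max_length=512)

with torch.no_grad():
    logits = model(**inputs).logits  # shape: (1, num_labels)

predicted_id = int(logits.argmax(dim=-1))
print(predicted_id, model.config.id2label.get(predicted_id))
```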
Model Features
Efficient and Lightweight
Based on the DistilBERT architecture, the model is roughly 40% smaller than BERT while retaining about 95% of its performance.
High Accuracy
Achieves 94.79% accuracy on the AG News validation set.
Adversarial Training
Trained using the TextAttack framework, potentially offering better adversarial robustness.
Model Capabilities
English Text Classification
News Category Prediction
Short Text Analysis
Use Cases
News Media
Automatic News Classification
Automatically categorizes news articles into the four AG News categories (World, Sports, Business, Sci/Tech); see the sketch at the end of this section
Validation accuracy: 94.79%
Content Management
Content Moderation Assistance
Helps identify and categorize user-generated content
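A minimal batch-classification sketch for the news-routing use case above. The Hub id, the example headlines, and the standard AG News label order (0 = World, 1 = Sports, 2 = Business, 3 = Sci/Tech) are assumptions rather than details from this card.

```python
from transformers import pipeline

# Standard AG News label order (assumed): indices 0 through 3.
AG_NEWS_LABELS = ["World", "Sports", "Business", "Sci/Tech"]

classifier = pipeline(
    "text-classification",
    model="textattack/distilbert-base-uncased-ag-news",  # assumed Hub id
)

articles = [
    "The national team advanced to the final after a penalty shootout.",
    "Researchers unveiled a battery that charges in under five minutes.",
]

for article, result in zip(articles, classifier(articles)):
    label = result["label"]
    # Checkpoints trained with TextAttack often expose generic ids such as
    # "LABEL_2"; map the trailing index back to a readable category name.
    if label.startswith("LABEL_"):
        label = AG_NEWS_LABELS[int(label.split("_")[-1])]
    print(f"{label:<9} {result['score']:.3f}  {article}")
```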