Ag News Distilbert
Developed by huggingfacecoin
A text classification model fine-tuned from distilbert-base-uncased on the AG News dataset, achieving 89.06% accuracy
Downloads: 31
Release Time: 5/17/2025
Model Overview
This model is a fine-tuned version of DistilBERT designed for news classification. It categorizes English news texts into the four AG News categories: World, Sports, Business, and Sci/Tech.
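As a quick illustration, the model can be loaded with the Hugging Face transformers pipeline. This is a minimal sketch: the repository ID huggingfacecoin/ag-news-distilbert is an assumption based on the model name and author shown above, not confirmed by this page.

```python
# Minimal inference sketch using the transformers text-classification pipeline.
# NOTE: the model ID below is an assumption based on the page metadata;
# substitute the actual Hugging Face repository ID.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="huggingfacecoin/ag-news-distilbert",  # assumed repo ID
)

headline = "Stocks rally as tech earnings beat expectations"
print(classifier(headline))
# e.g. [{'label': 'Business', 'score': 0.97}] if the config maps label ids
# to AG News category names, or [{'label': 'LABEL_2', ...}] if it does not.
```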
Model Features
Efficient and Lightweight
Built on the DistilBERT architecture, it is significantly smaller than full BERT while retaining most of its performance (see the size comparison sketch after this list)
High Accuracy
Achieves 89.06% classification accuracy on the AG News dataset
Fast Inference
Offers faster inference than the full BERT model
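For a rough sense of the size reduction, the sketch below loads the two public backbone checkpoints and compares their parameter counts; the fine-tuned classifier shares the distilbert-base-uncased backbone plus a small classification head.

```python
# Rough size comparison between the DistilBERT backbone and full BERT.
# Both model IDs are public Hugging Face checkpoints.
from transformers import AutoModel

distilbert = AutoModel.from_pretrained("distilbert-base-uncased")
bert = AutoModel.from_pretrained("bert-base-uncased")

print(f"distilbert-base-uncased: {distilbert.num_parameters() / 1e6:.0f}M parameters")
print(f"bert-base-uncased:       {bert.num_parameters() / 1e6:.0f}M parameters")
# DistilBERT has roughly 40% fewer parameters (about 66M vs. 110M),
# which is what makes the fine-tuned classifier comparatively light and fast.
```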
Model Capabilities
News text classification
English text understanding
Short text classification
Use Cases
News Media
Automatic News Classification
Automatically categorizes news articles into predefined categories (a batch classification sketch follows below)
89.06% accuracy
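A minimal batch-classification sketch for a news feed, again assuming the repository ID used in the overview example; it routes a list of articles into AG News categories.

```python
# Batch classification sketch: group incoming articles by predicted category.
# The repo ID is the same assumption as in the overview example.
from collections import defaultdict
from transformers import pipeline

classifier = pipeline("text-classification",
                      model="huggingfacecoin/ag-news-distilbert")  # assumed repo ID

articles = [
    "The central bank raised interest rates by a quarter point.",
    "The home team clinched the championship in overtime.",
    "Researchers unveil a new battery chemistry for electric cars.",
]

buckets = defaultdict(list)
for article, prediction in zip(articles, classifier(articles)):
    buckets[prediction["label"]].append(article)

for category, items in buckets.items():
    print(category, "->", len(items), "article(s)")
```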
Content Management
Automatic Content Tagging
Automatically adds classification tags to news content on websites or applications (a tagging sketch follows below)
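A sketch of automatic tagging for a content management system: attach the predicted category as a tag only when the classifier is sufficiently confident. The 0.8 threshold, the record schema, and the repository ID are illustrative assumptions, not part of the model.

```python
# Content-tagging sketch: add a category tag to CMS records when the
# classifier's confidence clears a threshold. Threshold and record schema
# are illustrative choices, not prescribed by the model.
from transformers import pipeline

classifier = pipeline("text-classification",
                      model="huggingfacecoin/ag-news-distilbert")  # assumed repo ID

CONFIDENCE_THRESHOLD = 0.8

records = [
    {"id": 1, "body": "Parliament passed the new trade agreement today.", "tags": []},
    {"id": 2, "body": "Quantum startup announces a 1,000-qubit prototype.", "tags": []},
]

for record in records:
    prediction = classifier(record["body"])[0]
    if prediction["score"] >= CONFIDENCE_THRESHOLD:
        record["tags"].append(prediction["label"])
    print(record["id"], record["tags"])
```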