Distilbert Reuters21578

Developed by tarekziade
A DistilBERT-based multi-label news classification model, fine-tuned on the ModApte split of the Reuters-21578 dataset, suitable for English news topic classification.
Downloads 30
Release Time: 12/17/2023

Model Overview

This model is a DistilBERT variant fine-tuned on the Reuters-21578 dataset, specifically designed for multi-label text classification tasks, capable of identifying multiple relevant topics in news articles.
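Below is a minimal usage sketch for loading the model with the Hugging Face transformers library and producing multi-label predictions. The checkpoint ID tarekziade/distilbert-reuters21578, the example sentence, and the 0.5 decision threshold are assumptions for illustration, not details taken from this model card.

```python
# Minimal multi-label inference sketch (assumed checkpoint ID and threshold).
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "tarekziade/distilbert-reuters21578"  # assumed Hugging Face ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id)

text = "OPEC ministers agreed to cut crude oil output to support prices."
inputs = tokenizer(text, return_tensors="pt", truncation=True)

with torch.no_grad():
    logits = model(**inputs).logits

# Multi-label classification: a sigmoid is applied per label (rather than a
# softmax over all labels), and every label above the threshold is kept.
probs = torch.sigmoid(logits)[0].tolist()
predicted = [model.config.id2label[i] for i, p in enumerate(probs) if p > 0.5]
print(predicted)
```

Because several Reuters topics can apply to the same article, the per-label sigmoid lets the model return zero, one, or many labels for a single input.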

Model Features

Efficient and Lightweight
Based on the DistilBERT architecture, it is significantly smaller and faster than a full BERT model while retaining most of its performance.
Multi-label Classification
Supports predicting multiple relevant topic labels for news articles simultaneously.
Precision Priority
The model prioritizes precision over recall, making it suitable for applications where false positives are costly; see the threshold sketch below for how this trade-off can be tuned.
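The precision/recall balance can also be adjusted at inference time by moving the sigmoid decision threshold. The following sketch uses made-up labels and probabilities purely to illustrate the trade-off; it is not evaluation data for this model.

```python
# Illustrative threshold sweep on placeholder data (not real model output).
import numpy as np
from sklearn.metrics import precision_score, recall_score

y_true = np.array([[1, 0, 1], [0, 1, 0], [1, 1, 0]])  # gold multi-hot labels
y_prob = np.array([[0.92, 0.40, 0.55],
                   [0.20, 0.75, 0.35],
                   [0.60, 0.45, 0.10]])                # per-label sigmoid scores

for threshold in (0.3, 0.5, 0.7):
    y_pred = (y_prob >= threshold).astype(int)
    p = precision_score(y_true, y_pred, average="micro", zero_division=0)
    r = recall_score(y_true, y_pred, average="micro", zero_division=0)
    print(f"threshold={threshold:.1f}  precision={p:.2f}  recall={r:.2f}")
```

Raising the threshold generally trades recall for precision, which matches the precision-first behaviour described above.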

Model Capabilities

English News Classification
Multi-label Prediction
Topic Identification

Use Cases

News Analysis
News Topic Tagging
Automatically tag news articles with relevant topic labels
Achieved an F1 score of 0.86 on the Reuters-21578 test set
Content Classification System
Build an automatic classification module for news content management systems