Distilbert Base Uncased Agnews Student
Developed by joeddav
This model was distilled from a zero-shot classification pipeline on the unlabeled AG News dataset, demonstrating how an expensive zero-shot model can be compressed into a more efficient student model.
Downloads 900
Release Time: 3/2/2022
Model Overview
This model was trained on teacher predictions over the unlabeled AG News dataset, with a zero-shot classification pipeline acting as the teacher. It can serve as a drop-in classifier for AG News, though its accuracy may not match that of a fully supervised model. A sketch of the distillation idea follows.
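The sketch below is a minimal illustration of the procedure, not the author's exact training script. It assumes facebook/bart-large-mnli as the zero-shot teacher (the card does not name the teacher checkpoint) and uses hard pseudo-labels; the original recipe may instead distill the teacher's full probability distribution.

```python
from transformers import (
    AutoModelForSequenceClassification,
    AutoTokenizer,
    pipeline,
)
import torch

labels = ["World", "Sports", "Business", "Sci/Tech"]  # AG News classes

# 1. Teacher: an expensive NLI-based zero-shot classifier (assumed checkpoint).
teacher = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")

# 2. Pseudo-label unlabeled AG News texts with the teacher's top prediction.
texts = [
    "Wall St. closes higher as oil prices retreat.",
    "The champions clinched the title with a late goal.",
]
pseudo_labels = [
    labels.index(teacher(t, candidate_labels=labels)["labels"][0]) for t in texts
]

# 3. Student: a small DistilBERT classifier trained on the pseudo-labels.
tokenizer = AutoTokenizer.from_pretrained("distilbert-base-uncased")
student = AutoModelForSequenceClassification.from_pretrained(
    "distilbert-base-uncased", num_labels=len(labels)
)
optimizer = torch.optim.AdamW(student.parameters(), lr=2e-5)

batch = tokenizer(texts, padding=True, truncation=True, return_tensors="pt")
outputs = student(**batch, labels=torch.tensor(pseudo_labels))
outputs.loss.backward()  # one illustrative training step
optimizer.step()
```

In practice the teacher would label the full unlabeled corpus once, and the student would be trained over it for several epochs; after training, only the small student is needed at inference time.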
Model Features
Distillation Technique
Obtained via distillation from a zero-shot classification pipeline, retaining partial performance of the teacher model.
Case Insensitive
The model is uncased, so it handles input text in any casing; see the tokenizer check after this feature list.
Efficient Inference
The distilled student classifies a text in a single forward pass, whereas the original zero-shot pipeline typically scores each candidate label separately, so inference is substantially more efficient.
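To illustrate the case-insensitivity feature above, the following check assumes the model's tokenizer lowercases input the way distilbert-base-uncased does, and that the Hub repo id is joeddav/distilbert-base-uncased-agnews-student (inferred from the card title and author):

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "joeddav/distilbert-base-uncased-agnews-student"  # assumed repo id
)

# Both casings should tokenize identically, so predictions are
# unaffected by the case of the input text.
print(tokenizer.tokenize("NASA Launches New Satellite"))
print(tokenizer.tokenize("nasa launches new satellite"))
```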
Model Capabilities
AG News Classification
Text Classification
Use Cases
News Classification
AG News Auto-Classification
Automatically classify news text into the predefined AG News categories (World, Sports, Business, Sci/Tech); see the usage sketch after this list.
Educational Demonstration
Model Distillation Technique Demo
Demonstrate how to distill an expensive zero-shot model into a more efficient student model.
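A hedged usage sketch for the AG News auto-classification use case, again assuming the repo id joeddav/distilbert-base-uncased-agnews-student and the standard transformers text-classification pipeline:

```python
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="joeddav/distilbert-base-uncased-agnews-student",  # assumed repo id
)

headline = "Tech giant unveils a faster chip for AI workloads"
print(classifier(headline))
# Expected shape: [{'label': ..., 'score': ...}]; the label names
# returned depend on the model's config.
```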