DistilBART XSum 12-3
DistilBART is a distilled version of the BART model, optimized for summarization tasks; it significantly reduces parameter count and inference time while largely preserving summarization quality.
Downloads: 579
Released: 3/2/2022
Model Overview
A lightweight summarization model based on the BART architecture, compressed from the original model via knowledge distillation. It is well suited to text-condensing tasks such as news summarization; a minimal usage sketch follows.
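Below is a minimal usage sketch with the Hugging Face `transformers` pipeline. The Hub checkpoint ID `sshleifer/distilbart-xsum-12-3` is an assumption inferred from the model name above, and the sample article is purely illustrative.

```python
# A minimal sketch, assuming the checkpoint is published on the Hugging Face Hub
# as "sshleifer/distilbart-xsum-12-3" (ID inferred from the model name above).
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-xsum-12-3")

article = (
    "The local council has approved a plan to expand the city's cycle network, "
    "adding 40 km of protected lanes over three years. Officials say the scheme "
    "aims to cut congestion and emissions in the city centre."
)

# XSum-style models are trained to produce a single, highly abstractive sentence.
result = summarizer(article, max_length=60, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```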
Model Features
Efficient Inference
Achieves a 2.54x speedup over the original BART model, significantly reducing computational resource requirements.
Performance Balance
Strikes a good balance between model size and summarization quality (ROUGE scores).
Multiple Variants Available
Offers variants at different parameter scales to accommodate varied hardware; see the sketch after this list.
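DistilBART checkpoints follow a `{encoder_layers}-{decoder_layers}` naming convention, so 12-3 keeps all 12 of BART's encoder layers and 3 of its 12 decoder layers. The sketch below shows swapping between sibling variants; the specific checkpoint IDs are assumptions based on that convention rather than something stated in this card.

```python
# A sketch of choosing among DistilBART XSum variants. The checkpoint IDs are
# assumptions based on the "{encoder_layers}-{decoder_layers}" naming scheme;
# fewer decoder layers trades some ROUGE for faster generation.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

VARIANTS = {
    "fastest":  "sshleifer/distilbart-xsum-12-1",  # 12 encoder / 1 decoder layer
    "balanced": "sshleifer/distilbart-xsum-12-3",  # 12 encoder / 3 decoder layers
    "quality":  "sshleifer/distilbart-xsum-12-6",  # 12 encoder / 6 decoder layers
}

checkpoint = VARIANTS["balanced"]
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)
```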
Model Capabilities
News Summarization
Long Text Compression
Key Information Extraction
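For long text compression, inputs beyond BART's 1024-token window have to be truncated (or chunked) before generation. The sketch below uses explicit input truncation and output-length bounds; the placeholder document and the Hub ID are assumptions.

```python
# A sketch of compressing a long document: truncate to the 1024-token input
# window the BART architecture supports, then generate with length bounds.
from transformers import AutoModelForSeq2SeqLM, AutoTokenizer

checkpoint = "sshleifer/distilbart-xsum-12-3"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(checkpoint)
model = AutoModelForSeq2SeqLM.from_pretrained(checkpoint)

long_text = " ".join(["One paragraph of a lengthy report."] * 400)  # stand-in document

inputs = tokenizer(long_text, truncation=True, max_length=1024, return_tensors="pt")
summary_ids = model.generate(
    **inputs,
    num_beams=4,       # beam search usually improves summary quality
    max_length=62,     # keep the output short and abstractive
    min_length=11,
    early_stopping=True,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```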
Use Cases
Media Industry
Automatic News Summarization
Compress lengthy news reports into concise summaries
Achieves a ROUGE-2 score of 21.37 on the XSum dataset
Knowledge Management
Document Summarization
Automatically generate summaries for technical documents or research reports
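For document summarization at scale, the same pipeline accepts a list of inputs. A brief sketch; the document texts are placeholders and the Hub ID is the same assumption as above.

```python
# A sketch of batch-summarizing several documents; the texts are placeholders
# and the Hub ID "sshleifer/distilbart-xsum-12-3" is an assumption.
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-xsum-12-3")

documents = [
    "Full text of a technical design document ...",
    "Full text of a research report ...",
]

# Passing a list lets the pipeline batch inputs; truncation=True guards
# against documents longer than the model's input window.
for out in summarizer(documents, truncation=True, max_length=60, min_length=10):
    print(out["summary_text"])
```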