DistilBART CNN 6-6
DistilBART is a distilled version of the BART model optimized for text summarization; it delivers significantly faster inference while retaining most of BART's summarization quality.
Downloads: 48.17k
Release date: 3/2/2022
Model Overview
A distilled model based on the BART architecture for generative text summarization tasks, specifically optimized for news summarization scenarios.
Model Features
Efficient Inference
2.54x faster than the original BART with 45% fewer parameters
Multiple Configuration Options
Offers 6 variants with different parameter counts to balance quality and efficiency (see the loading sketch after this list)
Dual Dataset Optimization
Versions specifically optimized for CNN/DailyMail and XSum datasets
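As a sketch of how the variants listed above might be compared, the snippet below loads two checkpoints and prints their parameter counts. The Hub IDs (sshleifer/distilbart-cnn-6-6 and sshleifer/distilbart-xsum-12-6) are assumptions rather than details taken from this page; substitute whichever variant you need.

```python
# Sketch: comparing two assumed DistilBART checkpoints by parameter count.
# Requires: pip install transformers torch
from transformers import AutoModelForSeq2SeqLM

# The Hub IDs below are assumptions; swap in the variant you actually need.
VARIANTS = [
    "sshleifer/distilbart-cnn-6-6",    # tuned toward CNN/DailyMail-style summaries
    "sshleifer/distilbart-xsum-12-6",  # tuned toward XSum-style one-sentence summaries
]

for name in VARIANTS:
    model = AutoModelForSeq2SeqLM.from_pretrained(name)
    print(f"{name}: {model.num_parameters() / 1e6:.0f}M parameters")
```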
Model Capabilities
Generative Text Summarization
Long Text Compression
News Key Point Extraction
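A minimal usage sketch for the capabilities listed above, assuming the checkpoint is published on the Hugging Face Hub as sshleifer/distilbart-cnn-6-6 and that the transformers library is installed; the example article and length settings are illustrative only.

```python
# Sketch: abstractive summarization with the transformers pipeline.
# The model ID is an assumption; tune max_length/min_length for your inputs.
from transformers import pipeline

summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-6-6")

article = (
    "The city council voted on Tuesday to approve a new transit plan that will "
    "add three bus rapid transit lines, expand bike lanes across downtown, and "
    "fund station upgrades over the next five years, officials said."
)

summary = summarizer(article, max_length=60, min_length=15, do_sample=False)
print(summary[0]["summary_text"])
```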
Use Cases
News Media
Automatic News Summarization
Compress lengthy news reports into concise summaries
Achieves a ROUGE-2 score of 22.12 on the XSum dataset
Content Analysis
Document Key Information Extraction
Automatically extract core content from long documents
Achieves a ROUGE-2 score of 21.26 on the CNN/DailyMail dataset
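The ROUGE-2 figures above are dataset-level benchmark results. As an illustration of how such a score can be computed for your own outputs, here is a small sketch using the evaluate library; this is an assumption about tooling, not the evaluation setup used to produce the numbers on this page.

```python
# Sketch: computing ROUGE-2 for generated summaries against reference summaries.
# Requires: pip install evaluate rouge_score
import evaluate

rouge = evaluate.load("rouge")

# Toy predictions/references purely for illustration.
predictions = ["the council approved a five-year transit expansion plan"]
references = ["city council approves five-year plan to expand transit"]

scores = rouge.compute(predictions=predictions, references=references)
print(f"ROUGE-2: {scores['rouge2']:.4f}")
```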