DistilBART XSUM 12-1
DistilBART is a distilled version of the BART model for text summarization; it substantially reduces parameter count and inference time while retaining most of the teacher model's summary quality.
Downloads 396
Release Time: 3/2/2022
Model Overview
A distilled model based on the BART architecture, specifically designed for generating concise summaries of news articles, balancing model size and summary quality. The "12-1" in the name refers to its layer configuration: 12 encoder layers and a single decoder layer.
Model Features
Efficient Inference
Inference is 2.54x faster than the original BART-large model on the XSUM task.
Parameter Optimization
45% reduction in parameters (222M vs 406M) while maintaining near-baseline summary quality.
Multiple Configuration Options
Provides model variants with different layer configurations to meet various performance needs.
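As a quick sanity check on the figures above, the quoted 45% reduction follows directly from the two parameter counts. A minimal arithmetic sketch:

```python
# Verify the quoted parameter reduction: 222M (distilled) vs 406M (BART-large).
baseline_params = 406e6
distilled_params = 222e6
reduction = (baseline_params - distilled_params) / baseline_params
print(f"Parameter reduction: {reduction:.1%}")  # prints "Parameter reduction: 45.3%"
```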
Model Capabilities
News Summarization
Long Text Compression
Key Information Extraction
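The capabilities above can be exercised through the Hugging Face `transformers` summarization pipeline. This is a minimal sketch, not the author's reference code; the model id below is an assumption based on the "12-1" naming convention used for DistilBART checkpoints on the Hugging Face Hub, so substitute the variant you actually use.

```python
# Assumed model id following the DistilBART "12-1" naming on the HF Hub.
MODEL_ID = "sshleifer/distilbart-xsum-12-1"

def summarize(text: str, max_length: int = 60, min_length: int = 10) -> str:
    """Return a short, XSUM-style abstractive summary of `text`."""
    # Imported lazily so the helper can be defined without loading the
    # (heavyweight) transformers library until it is actually needed.
    from transformers import pipeline

    summarizer = pipeline("summarization", model=MODEL_ID)
    output = summarizer(text, max_length=max_length,
                        min_length=min_length, do_sample=False)
    return output[0]["summary_text"]
```

In a long-running service you would build the pipeline once and reuse it across calls rather than reconstructing it per request, since loading the model dominates latency.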
Use Cases
News Media
Automatic News Summarization
Generates concise, single-sentence summaries of lengthy news reports, in the extreme-summarization style of the XSUM dataset.
Achieves ROUGE-L 33.37 on the XSUM dataset.
Content Analysis
Document Key Information Extraction
Extracts core content from long documents.