# Lightweight BART
## Distilbart Multi News 12 6 2
Author: Angel0J · License: Apache-2.0 · Task: Text Generation · Library: Transformers · Language: English · Downloads: 313 · Likes: 0

DistilBART-CNN-12-6 is a lightweight summarization model based on the BART architecture, optimized for the CNN/Daily Mail dataset.

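All of the checkpoints on this page are summarization models served through the Transformers library, so they share the same usage pattern. A minimal sketch, using sshleifer/distilbart-cnn-12-6 (the base checkpoint referenced by several entries below) as a stand-in for any of the models listed here:

```python
from transformers import pipeline

# Load a DistilBART summarization checkpoint from the Hugging Face Hub.
# sshleifer/distilbart-cnn-12-6 is used as an example; any summarization
# checkpoint from this page can be substituted.
summarizer = pipeline("summarization", model="sshleifer/distilbart-cnn-12-6")

article = (
    "BART combines a bidirectional encoder with an autoregressive decoder and "
    "is pretrained as a denoising autoencoder. Distilled variants keep the "
    "encoder and copy a subset of decoder layers, which makes them "
    "considerably faster at inference time."
)

# max_length / min_length bound the generated summary length in tokens.
summary = summarizer(article, max_length=60, min_length=15, do_sample=False)
print(summary[0]["summary_text"])
```
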
## Distilbart Cnn 6 6
Author: Xenova · License: Apache-2.0 · Task: Text Generation · Library: Transformers · Downloads: 1,260 · Likes: 7

A lightweight text summarization model based on the BART architecture. Knowledge distillation shrinks the original model while retaining its core summarization capability.

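The size reduction from distillation can be seen directly by comparing parameter counts. A minimal sketch, assuming the teacher is facebook/bart-large-cnn (the usual teacher for the distilbart-cnn family) and the student is the sshleifer/distilbart-cnn-6-6 checkpoint:

```python
from transformers import AutoModelForSeq2SeqLM

def param_count(model_id: str) -> int:
    """Return the total number of parameters in a seq2seq checkpoint."""
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
    return sum(p.numel() for p in model.parameters())

# Exact counts depend on the checkpoints actually downloaded.
teacher = param_count("facebook/bart-large-cnn")
student = param_count("sshleifer/distilbart-cnn-6-6")

print(f"teacher: {teacher / 1e6:.0f}M parameters")
print(f"student: {student / 1e6:.0f}M parameters ({student / teacher:.0%} of teacher)")
```
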
## MLQ Distilbart Bbc
Author: DeepNLP-22-23 · License: Apache-2.0 · Task: Text Generation · Library: Transformers · Downloads: 20 · Likes: 0

A text summarization model based on sshleifer/distilbart-cnn-12-6 and fine-tuned on the BBC News Summary dataset, developed by the Deep Natural Language Processing course lab at Politecnico di Torino.

## Distilbart Cnn 12 6 Finetuned Resume Summarizer
Author: Ameer05 · Task: Text Generation · Library: Transformers · Downloads: 19 · Likes: 0

A resume summarization model fine-tuned from distilbart-cnn-12-6 that performs well on ROUGE metrics.

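ROUGE figures like the ones mentioned for this model are computed from model outputs against reference summaries. A minimal sketch using the evaluate library, with hypothetical prediction and reference strings standing in for real resume data:

```python
import evaluate

# The `evaluate` library provides the standard ROUGE implementation used to
# report summarization quality (ROUGE-1/2/L F-measures).
rouge = evaluate.load("rouge")

# Hypothetical example: model outputs vs. human-written reference summaries.
predictions = ["Experienced data engineer skilled in Python and cloud pipelines."]
references = ["Data engineer with several years of Python and cloud pipeline experience."]

scores = rouge.compute(predictions=predictions, references=references)
print(scores)  # dict with rouge1, rouge2, rougeL, rougeLsum scores
```
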
## Distilbart Xsum 9 6
Author: sshleifer · License: Apache-2.0 · Task: Text Generation · Language: English · Downloads: 421 · Likes: 0

DistilBART is a distilled version of BART focused on text summarization, offering significantly faster inference while maintaining strong summarization quality.

## Bart Base Cnn R2 19.4 D35 Hybrid
Author: echarlaix · License: Apache-2.0 · Task: Text Generation · Library: Transformers · Language: English · Downloads: 20 · Likes: 0

A pruned and optimized BART-base model for summarization that retains 53% of the original model's weights.

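The retained-weight figure for a pruned checkpoint can be sanity-checked by counting nonzero parameters, assuming the pruning zeroes out removed weights. A minimal sketch; the hub ID below is a hypothetical placeholder for this entry's actual identifier:

```python
import torch
from transformers import AutoModelForSeq2SeqLM

# Hypothetical hub ID standing in for the pruned checkpoint described above;
# replace it with the model's actual identifier.
model = AutoModelForSeq2SeqLM.from_pretrained("echarlaix/bart-base-cnn-hybrid")

total = 0
nonzero = 0
for p in model.parameters():
    total += p.numel()
    nonzero += torch.count_nonzero(p).item()

# If pruning zeroes out removed weights, this ratio approximates the
# fraction of weights that were retained.
print(f"retained weights: {nonzero / total:.0%}")
```
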
## Distilbart Xsum 12 3
Author: sshleifer · License: Apache-2.0 · Task: Text Generation · Language: English · Downloads: 579 · Likes: 11

DistilBART is a distilled version of BART optimized for summarization, substantially reducing parameter count and inference time while maintaining strong quality.

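The inference-time claim can be checked with a rough wall-clock comparison against the full-size teacher. A minimal sketch, assuming the sshleifer/distilbart-xsum-12-3 and facebook/bart-large-xsum checkpoints; absolute timings depend on hardware:

```python
import time
from transformers import pipeline

TEXT = (
    "Researchers have released a family of distilled BART checkpoints that "
    "copy a subset of the teacher's decoder layers, trading a small amount "
    "of summary quality for noticeably lower latency."
)

def time_summarizer(model_id: str, runs: int = 3) -> float:
    """Average wall-clock seconds per summary for a given checkpoint."""
    summarizer = pipeline("summarization", model=model_id)
    start = time.perf_counter()
    for _ in range(runs):
        summarizer(TEXT, max_length=40, min_length=10, do_sample=False)
    return (time.perf_counter() - start) / runs

# On CPU the distilled model is expected to be noticeably faster than the
# full bart-large teacher; exact numbers vary by machine.
print("distilbart-xsum-12-3:", time_summarizer("sshleifer/distilbart-xsum-12-3"))
print("bart-large-xsum:", time_summarizer("facebook/bart-large-xsum"))
```
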
## Bart Base Cnn R2 18.7 D23 Hybrid
Author: echarlaix · License: Apache-2.0 · Task: Text Generation · Library: Transformers · Language: English · Downloads: 18 · Likes: 0

A pruned and optimized BART-base model fine-tuned on the CNN/DailyMail dataset for summarization.