BART Large CNN

Developed by Facebook
A BART model pre-trained on an English corpus and fine-tuned on the CNN/Daily Mail dataset, suited to text summarization tasks
Downloads: 3.8M
Released: 3/2/2022

Model Overview

This model uses the Transformer encoder-decoder architecture and performs well on text generation and comprehension tasks thanks to its denoising sequence-to-sequence pre-training. The current version is specifically optimized for news summarization.

Model Features

Bidirectional encoder
Uses a BERT-style bidirectional encoder to fully capture contextual semantics
Autoregressive decoder
GPT-like autoregressive generation ensures fluent output text
Domain-specific fine-tuning
Fine-tuned on the CNN/Daily Mail news dataset, delivering strong summarization performance

Model Capabilities

News text summarization
Long text compression
Key information extraction
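The capabilities above can be tried directly with the Hugging Face `transformers` summarization pipeline. A minimal sketch follows: the model ID `facebook/bart-large-cnn` is the official Hub name, while the sample article and the generation parameters (`max_length`, `min_length`) are illustrative choices, not prescribed settings.

```python
# Minimal summarization sketch using the Hugging Face transformers pipeline.
# Assumes `pip install transformers torch`; sample text and length limits
# below are illustrative.
from transformers import pipeline

# Load the CNN/Daily Mail fine-tuned BART checkpoint from the Hub.
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")

article = (
    "The tower is 324 metres tall, about the same height as an 81-storey "
    "building, and is the tallest structure in Paris. Its base is square, "
    "measuring 125 metres on each side. During its construction, the Eiffel "
    "Tower surpassed the Washington Monument to become the tallest man-made "
    "structure in the world, a title it held for 41 years."
)

# do_sample=False keeps decoding deterministic (beam search).
result = summarizer(article, max_length=60, min_length=20, do_sample=False)
print(result[0]["summary_text"])
```

Note that inputs longer than the model's 1024-token context window are truncated, so very long documents should be chunked before summarization.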

Use Cases

News media
News briefing generation
Automatically compresses lengthy news reports into concise summaries
ROUGE-L score: 30.6186 (CNN/Daily Mail test set)
Content preview generation
Automatically generates article previews for online news platforms
Average generated summary length: 78.6 words
Information processing
Document summarization
Extracts key information from long documents