
BART Large

Developed by Facebook
BART is a Transformer sequence-to-sequence model that combines a bidirectional encoder with an autoregressive decoder, making it suitable for both text generation and text understanding tasks.
Downloads: 119.86k
Release Date: 3/2/2022

Model Overview

Through denoising sequence-to-sequence pretraining, BART performs strongly on natural language generation, translation, and comprehension tasks.
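
The checkpoint can be loaded with the Hugging Face Transformers library. A minimal loading sketch; the identifier "facebook/bart-large" is assumed to be the hub name for this checkpoint:

from transformers import BartTokenizer, BartForConditionalGeneration

# Load the pretrained BART Large tokenizer and sequence-to-sequence model
tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")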

Model Features

Bidirectional Encoder plus Autoregressive Decoder
Pairs a BERT-style bidirectional encoder with a GPT-style autoregressive decoder, giving the model both understanding and generation capabilities
Denoising Pretraining
Pretrained by corrupting text and learning to reconstruct it, which improves robustness to noisy input (see the mask-infilling sketch after this list)
Multi-Task Adaptability
Performs well in both generation tasks (such as summarization) and understanding tasks (such as classification)
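
The denoising objective can be exercised directly: because BART is pretrained to reconstruct corrupted text, the base checkpoint can fill a <mask> span. A minimal sketch using the Transformers generation API; the input sentence is only illustrative:

from transformers import BartTokenizer, BartForConditionalGeneration

tokenizer = BartTokenizer.from_pretrained("facebook/bart-large")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-large")

# The model generates the reconstructed sequence, filling in the <mask> span
text = "UN Chief Says There Is No <mask> in Syria"
inputs = tokenizer(text, return_tensors="pt")
generated_ids = model.generate(inputs["input_ids"], max_new_tokens=30)
print(tokenizer.batch_decode(generated_ids, skip_special_tokens=True)[0])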

Model Capabilities

Text Generation
Machine Translation
Text Summarization
Text Classification
Question Answering
Text Infilling

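For understanding tasks such as classification, a fine-tuned variant is typically used. A minimal zero-shot classification sketch, assuming the "facebook/bart-large-mnli" checkpoint (a BART Large variant fine-tuned on MNLI, not the base checkpoint described here):

from transformers import pipeline

# Zero-shot classification backed by BART Large fine-tuned for natural language inference
classifier = pipeline("zero-shot-classification", model="facebook/bart-large-mnli")
result = classifier(
    "The team shipped a new software release last night.",
    candidate_labels=["sports", "software", "politics"],
)
print(result["labels"][0])  # highest-scoring label
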
Use Cases

Text Generation
Automatic Summarization: generate a concise summary of a long text, producing fluent output that retains the key information (see the sketch after this list)
Machine Translation
Cross-Language Translation: translate text from one language into another, producing accurate and natural results
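
A minimal summarization sketch, assuming the "facebook/bart-large-cnn" checkpoint (a BART Large variant fine-tuned on CNN/DailyMail for summarization, not the base checkpoint described here):

from transformers import pipeline

# Summarization pipeline backed by a BART Large model fine-tuned for the task
summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
article = (
    "BART is a sequence-to-sequence model pretrained by corrupting text and "
    "learning to reconstruct it. It pairs a bidirectional encoder with an "
    "autoregressive decoder and is widely used for abstractive summarization."
)
print(summarizer(article, max_length=40, min_length=10, do_sample=False)[0]["summary_text"])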