
BART Base

Developed by Facebook
BART is a Transformer model that combines a bidirectional encoder with an autoregressive decoder, making it suitable for both text generation and text understanding tasks.
Downloads: 2.1M
Release Date: 3/2/2022

Model Overview

Through denoising sequence-to-sequence pre-training, BART excels at text generation tasks (e.g., summarization, translation) as well as understanding tasks (e.g., classification, question answering).
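
A minimal loading sketch, assuming the Hugging Face transformers library and the facebook/bart-base checkpoint on the Hub:

# Load BART Base and run one encoder-decoder forward pass.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

inputs = tokenizer("BART is a sequence-to-sequence model.", return_tensors="pt")
outputs = model(**inputs)    # decoder inputs are derived from the encoder inputs by default
print(outputs.logits.shape)  # (batch_size, sequence_length, vocab_size)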

Model Features

Combination of Bidirectional Encoder and Autoregressive Decoder
Integrates a BERT-style bidirectional encoder with a GPT-style autoregressive decoder, combining comprehension and generation capabilities.
Denoising Pre-training
Pre-trained by corrupting text with a noising function and learning to reconstruct the original, which improves robustness (see the sketch below).
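
A hedged illustration of the denoising idea at inference time, assuming transformers is installed: the input is corrupted with BART's <mask> token and the model generates a reconstruction (the exact output depends on the checkpoint).

# Mask-infilling: the model fills in the corrupted span.
from transformers import BartForConditionalGeneration, BartTokenizer

tokenizer = BartTokenizer.from_pretrained("facebook/bart-base")
model = BartForConditionalGeneration.from_pretrained("facebook/bart-base")

corrupted = "UN Chief says there is no <mask> in Syria"
input_ids = tokenizer(corrupted, return_tensors="pt").input_ids
generated = model.generate(input_ids, max_length=20, num_beams=4)
print(tokenizer.decode(generated[0], skip_special_tokens=True))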

Model Capabilities

Text Generation
Text Summarization
Machine Translation
Text Classification
Question Answering

Use Cases

Text Generation
Automatic Summarization
Generates concise summaries from long texts (see the sketch after this list)
Machine Translation
Translates text between different languages
Text Understanding
Text Classification
Labels text with classification tags
Question Answering
Answers questions based on text content
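
A summarization sketch using the transformers pipeline API. Note that facebook/bart-base is a pre-trained model only; this example assumes a summarization fine-tuned BART checkpoint such as facebook/bart-large-cnn, which is typically used in practice.

# Summarize a short passage with a fine-tuned BART checkpoint.
from transformers import pipeline

summarizer = pipeline("summarization", model="facebook/bart-large-cnn")
article = (
    "BART is pre-trained by corrupting documents with an arbitrary noising "
    "function and learning a sequence-to-sequence model to reconstruct the "
    "original text. It achieves strong results on summarization, translation, "
    "and comprehension tasks."
)
print(summarizer(article, max_length=40, min_length=10, do_sample=False))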