
Barthez

Developed by moussaKam
BARThez is a French sequence-to-sequence pre-trained model based on the BART architecture, particularly suitable for generative tasks such as abstractive summarization.
Downloads: 1,032
Release Time: 3/2/2022

Model Overview

BARThez is a French sequence-to-sequence model pre-trained by reconstructing corrupted input sentences, following the BART denoising objective. Unlike BERT-based French models, BARThez pre-trains both its encoder and its decoder, which makes it especially suitable for generative tasks.
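To make the denoising objective concrete, the sketch below corrupts a token sequence BART-style by replacing contiguous spans with a single mask token; the model then learns to reconstruct the original sentence. The masking probability and span lengths here are illustrative placeholders, not BARThez's actual pre-training settings.

```python
import random

MASK = "<mask>"  # placeholder mask token, analogous to BART's <mask>

def corrupt(tokens, mask_prob=0.3, max_span=3, seed=0):
    """Illustrative text-infilling corruption: with probability mask_prob,
    replace a span of up to max_span tokens with a single MASK token.
    (BART samples span lengths from a Poisson distribution; a uniform
    draw is used here purely for simplicity.)"""
    rng = random.Random(seed)
    out, i = [], 0
    while i < len(tokens):
        if rng.random() < mask_prob:
            out.append(MASK)               # one mask replaces the whole span
            i += rng.randint(1, max_span)  # skip the masked-out tokens
        else:
            out.append(tokens[i])
            i += 1
    return out

tokens = "le chat dort sur le canapé".split()
print(corrupt(tokens, seed=1))
```

During pre-training, the corrupted sequence is fed to the encoder and the decoder is trained to emit the original, uncorrupted sentence.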

Model Features

Sequence-to-sequence pre-training
Pre-trained by reconstructing corrupted input sentences; because both the encoder and the decoder are trained, the model is well suited to generative tasks.
French language optimization
Pre-trained specifically on French text, using a 66GB corpus of raw French.
Multi-version support
Available in both base and large sizes to accommodate different computational budgets.

Model Capabilities

Text generation
Abstractive summarization
Sequence-to-sequence tasks

Use Cases

Text generation
Abstractive summarization: generate concise summaries of French texts.
Natural language processing
Text reconstruction: reconstruct corrupted French sentences.
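A minimal usage sketch for the summarization use case, via the Hugging Face transformers library. The checkpoint name `moussaKam/barthez-orangesum-abstract` is an assumption (a summarization fine-tune under the same namespace); substitute whichever BARThez checkpoint fits your task. The import is kept inside the function so the sketch can be loaded without transformers installed.

```python
def summarize(text: str,
              model_id: str = "moussaKam/barthez-orangesum-abstract",
              max_length: int = 60) -> str:
    """Generate an abstractive French summary with a BARThez checkpoint.
    Downloads the model on first use; model_id is an assumed checkpoint name."""
    from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForSeq2SeqLM.from_pretrained(model_id)
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    output_ids = model.generate(**inputs, num_beams=4, max_length=max_length)
    return tokenizer.decode(output_ids[0], skip_special_tokens=True)

if __name__ == "__main__":
    article = "Le gouvernement a annoncé de nouvelles mesures économiques..."
    print(summarize(article))
```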