mBART Large CC25

Developed by Facebook
mbart-large-cc25 is a multilingual pre-trained sequence-to-sequence model developed by Facebook that supports translation tasks across 25 languages.
Downloads: 15.92k
Release Time: 3/2/2022

Model Overview

A Transformer-based multilingual machine translation model suited to a wide range of language pairs; it can also be fine-tuned for text summarization.
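
As a minimal sketch of how the checkpoint might be loaded and queried, assuming the Hugging Face transformers library (MBartTokenizer and MBartForConditionalGeneration) and an illustrative English (en_XX) to Romanian (ro_RO) direction; since mbart-large-cc25 is pre-trained but not task-tuned, this mainly demonstrates the loading and generation pattern rather than polished translation quality.

```python
from transformers import MBartForConditionalGeneration, MBartTokenizer

# Load the pre-trained checkpoint; the English -> Romanian direction is
# illustrative only, any of the 25 supported language codes can be used.
tokenizer = MBartTokenizer.from_pretrained(
    "facebook/mbart-large-cc25", src_lang="en_XX", tgt_lang="ro_RO"
)
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-cc25")

text = "UN Chief Says There Is No Military Solution in Syria"
inputs = tokenizer(text, return_tensors="pt")

# Force the decoder to start with the target-language token.
generated = model.generate(
    **inputs, decoder_start_token_id=tokenizer.lang_code_to_id["ro_RO"]
)
print(tokenizer.batch_decode(generated, skip_special_tokens=True))
```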

Model Features

Multilingual Support
Supports translation between 25 languages, covering European, Asian, and other language families
Pre-trained Model
Pre-trained on large-scale multilingual data and ready for downstream tasks or further fine-tuning (a fine-tuning sketch follows this list)
Sequence-to-Sequence Architecture
Transformer-based encoder-decoder structure, ideal for sequence generation tasks
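
Because the checkpoint is pre-trained rather than task-tuned, downstream use usually begins with supervised fine-tuning on parallel data. The sketch below shows a single training step under the same transformers/PyTorch assumption; the toy sentence pair, direction, and learning rate are illustrative only, and the text_target argument assumes a recent transformers release.

```python
import torch
from transformers import MBartForConditionalGeneration, MBartTokenizer

tokenizer = MBartTokenizer.from_pretrained(
    "facebook/mbart-large-cc25", src_lang="en_XX", tgt_lang="ro_RO"
)
model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-cc25")

# Toy parallel pair standing in for a real training corpus.
src = "The weather is nice today."
tgt = "Vremea este frumoasă astăzi."

# text_target tokenizes the reference with target-language special tokens
# and stores it under "labels" for the sequence-to-sequence loss.
batch = tokenizer(src, text_target=tgt, return_tensors="pt")

optimizer = torch.optim.AdamW(model.parameters(), lr=3e-5)
model.train()

outputs = model(**batch)   # cross-entropy loss over the target tokens
outputs.loss.backward()
optimizer.step()
optimizer.zero_grad()
```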

Model Capabilities

Machine Translation
Text Summarization (see the summarization sketch after this list)
Multilingual Text Generation
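
Summarization follows the same sequence-to-sequence generation pattern once the model has been fine-tuned on a summarization corpus. The checkpoint name below is a placeholder (no fine-tuned summarization checkpoint is named on this card), and source and target language are set to the same code because input and output are in one language.

```python
from transformers import MBartForConditionalGeneration, MBartTokenizer

# Placeholder name for an mBART checkpoint fine-tuned on summarization data;
# no such checkpoint is referenced by this card.
ckpt = "your-org/mbart-cc25-summarization"

tokenizer = MBartTokenizer.from_pretrained(ckpt, src_lang="en_XX", tgt_lang="en_XX")
model = MBartForConditionalGeneration.from_pretrained(ckpt)

article = (
    "The city council approved the new transit plan on Tuesday after months "
    "of public hearings, allocating funds for additional bus routes."
)
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=1024)

# Beam search with a capped output length keeps the summary short.
summary_ids = model.generate(
    **inputs,
    decoder_start_token_id=tokenizer.lang_code_to_id["en_XX"],
    num_beams=4,
    max_length=128,
)
print(tokenizer.batch_decode(summary_ids, skip_special_tokens=True)[0])
```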

Use Cases

Language Services
Multilingual Document Translation
Translate business documents, technical documentation, and similar material between languages (a batch-translation sketch follows this section)
News Summarization
Generate automatic summaries for multilingual news content
Education
Language Learning Assistance
Provide translation assistance for language learners
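
As a sketch of the document-translation scenario, the example below batches a document's sentences and translates them with facebook/mbart-large-en-ro, a fine-tuned English-to-Romanian model derived from mBART that is used here purely for illustration (it is not listed on this card).

```python
from transformers import MBartForConditionalGeneration, MBartTokenizer

# Example fine-tuned direction: English -> Romanian.
ckpt = "facebook/mbart-large-en-ro"
tokenizer = MBartTokenizer.from_pretrained(ckpt, src_lang="en_XX", tgt_lang="ro_RO")
model = MBartForConditionalGeneration.from_pretrained(ckpt)

# Sentences from a business document (illustrative).
document = [
    "The contract takes effect on the first of March.",
    "Both parties agree to the attached technical specifications.",
]

# Pad to a common length so the whole document is translated in one batch.
batch = tokenizer(document, return_tensors="pt", padding=True)
outputs = model.generate(
    **batch, decoder_start_token_id=tokenizer.lang_code_to_id["ro_RO"]
)
for line in tokenizer.batch_decode(outputs, skip_special_tokens=True):
    print(line)
```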