mBART Large CC25
mbart-large-cc25 is a multilingual pre-trained sequence-to-sequence model developed by Facebook AI, covering 25 languages and intended for translation and other text-generation tasks.
Machine Translation
Tags: Transformers · Supports Multiple Languages · Multilingual Translation · Pre-trained Models · Zero-shot Learning

Downloads 15.92k
Release Date: 3/2/2022
Model Overview
A Transformer-based multilingual machine translation model suitable for various language pairs; it can also be fine-tuned for text summarization.
Model Features
Multilingual Support
Supports translation tasks between 25 languages, covering European, Asian, and other languages
Pre-trained Model
Pre-trained on extensive multilingual data, ready for downstream tasks or further fine-tuning
Sequence-to-Sequence Architecture
Transformer-based encoder-decoder structure, ideal for sequence generation tasks
Model Capabilities
Machine Translation
Text Summarization
Multilingual Text Generation
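As a sketch of how the capabilities above are typically exercised, the snippet below wraps translation with the Hugging Face Transformers `MBartForConditionalGeneration` API. The `translate` helper, its default language codes, and the `max_length` value are illustrative assumptions, not part of the model card; note that the raw pre-trained checkpoint is usually fine-tuned on a language pair before it produces useful translations.

```python
def translate(text: str, src_lang: str = "en_XX", tgt_lang: str = "ro_RO") -> str:
    """Hypothetical helper: translate `text` with an mbart-large-cc25 checkpoint.

    The pre-trained checkpoint is meant for fine-tuning; generating directly
    from it without fine-tuning will yield low-quality output.
    """
    # Imports are kept local so the sketch reads without transformers installed.
    from transformers import MBartForConditionalGeneration, MBartTokenizer

    # src_lang tells the tokenizer which language code to append to the input.
    tokenizer = MBartTokenizer.from_pretrained(
        "facebook/mbart-large-cc25", src_lang=src_lang
    )
    model = MBartForConditionalGeneration.from_pretrained("facebook/mbart-large-cc25")

    inputs = tokenizer(text, return_tensors="pt")
    # mBART decodes starting from the target-language code token.
    generated = model.generate(
        **inputs,
        decoder_start_token_id=tokenizer.lang_code_to_id[tgt_lang],
        max_length=128,
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]
```

Usage would look like `translate("Business contract draft", src_lang="en_XX", tgt_lang="de_DE")`; the 25 supported language codes (e.g. `en_XX`, `de_DE`, `zh_CN`) follow mBART's locale-suffixed convention.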
Use Cases
Language Services
Multilingual Document Translation
Translate business and technical documents between different languages
News Summarization
Generate automatic summaries for multilingual news content
Education
Language Learning Assistance
Provide translation assistance for language learners