
mBART Large 50

Developed by Facebook
mBART-50 is a multilingual sequence-to-sequence model built with multilingual denoising pre-training, supporting translation tasks across 50 languages.
Downloads: 69.04k
Release date: 3/2/2022

Model Overview

The model is pre-trained with a multilingual denoising objective and then fine-tuned jointly on many translation directions, enabling translation between any pair of its 50 supported languages.
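A minimal usage sketch with the Hugging Face `transformers` library. Note that `facebook/mbart-large-50` is the pre-trained base and generally needs fine-tuning before it translates well; the sketch below assumes the publicly released many-to-many fine-tuned checkpoint instead, and language codes such as `en_XX`/`fr_XX` follow the mBART-50 convention.

```python
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

# Assumption: using the many-to-many fine-tuned variant rather than the
# raw pre-trained base, since the base checkpoint is not a ready-made
# translation system.
MODEL_NAME = "facebook/mbart-large-50-many-to-many-mmt"


def translate(text: str, src_lang: str, tgt_lang: str) -> str:
    """Translate `text` from `src_lang` to `tgt_lang` (mBART-50 lang codes)."""
    tokenizer = MBart50TokenizerFast.from_pretrained(MODEL_NAME, src_lang=src_lang)
    model = MBartForConditionalGeneration.from_pretrained(MODEL_NAME)
    inputs = tokenizer(text, return_tensors="pt")
    # Forcing the target-language token as the first generated token tells
    # the decoder which of the 50 languages to produce.
    generated = model.generate(
        **inputs,
        forced_bos_token_id=tokenizer.convert_tokens_to_ids(tgt_lang),
    )
    return tokenizer.batch_decode(generated, skip_special_tokens=True)[0]


if __name__ == "__main__":
    # Downloads the checkpoint (~2.4 GB) on first run.
    print(translate("Hello, world!", src_lang="en_XX", tgt_lang="fr_XX"))
```

The `forced_bos_token_id` argument is what distinguishes mBART-50 inference from ordinary seq2seq generation: the same model weights serve every translation direction, and the forced first token selects the output language.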

Model Features

Multilingual support
Supports translation between any pair of 50 languages, covering a wide range of language pairs
Denoising pre-training
Pre-trained with a text-infilling scheme that masks contiguous spans of text, improving model robustness
Multilingual joint fine-tuning
Fine-tunes on many translation directions simultaneously, rather than the traditional one-direction-at-a-time approach
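The span-masking idea behind the denoising pre-training can be illustrated with a minimal sketch. This is not the exact mBART noise function (which operates on subword tokens and also permutes sentences); the ~35% mask ratio and the Poisson(3.5) span-length distribution are taken from the mBART recipe, while the per-position trigger probability here is a simplification.

```python
import math
import random

MASK = "<mask>"


def sample_poisson(lam: float, rng: random.Random) -> int:
    """Knuth's algorithm: draw a Poisson-distributed span length."""
    limit = math.exp(-lam)
    k, p = 0, 1.0
    while True:
        k += 1
        p *= rng.random()
        if p <= limit:
            return k - 1


def infill(tokens, mask_ratio=0.35, lam=3.5, seed=0):
    """Simplified text infilling: replace random spans with a single <mask>.

    Each masked span of several tokens collapses to ONE mask token, so the
    model must also learn how many tokens are missing, not just which ones.
    """
    rng = random.Random(seed)
    budget = int(len(tokens) * mask_ratio)  # total tokens to remove
    out, i = [], 0
    while i < len(tokens):
        if budget > 0 and rng.random() < mask_ratio:
            span = max(1, sample_poisson(lam, rng))
            span = min(span, budget, len(tokens) - i)
            out.append(MASK)  # whole span -> one mask token
            i += span
            budget -= span
        else:
            out.append(tokens[i])
            i += 1
    return out


if __name__ == "__main__":
    words = "the quick brown fox jumps over the lazy dog".split()
    print(infill(words, seed=1))
```

During pre-training the model sees the noised sequence as encoder input and must reconstruct the original sequence in the decoder, which is what makes the same architecture directly reusable for translation fine-tuning.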

Model Capabilities

Multilingual machine translation
Sequence-to-sequence task processing
Cross-lingual text generation

Use Cases

Machine translation
Multilingual document translation
Translates documents from a source language into multiple target languages
Supports high-quality translation between any pair of the 50 languages
Cross-lingual information retrieval
Helps users find relevant information written in other languages
Lowers language barriers and speeds up information access
Language learning
Language learning assistant tool
Provides instant translation references for language learners
Helps learners understand how ideas are expressed across languages