
mBART-large-50 Finetuned XL-Sum Summarization

Developed by skripsi-summarization-1234
mBART-large-50 is a multilingual sequence-to-sequence model supporting text summarization and generation in 50 languages; this checkpoint is fine-tuned on the XL-Sum dataset for summarization.
Downloads 28.54k
Release Time: 5/20/2025

Model Overview

mBART-large-50 is a multilingual pretrained model based on the BART architecture, specifically designed for sequence-to-sequence tasks such as text summarization and machine translation.
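A minimal usage sketch with the Hugging Face transformers library is shown below; the repository id skripsi-summarization-1234/mbart-large-50-finetuned-xlsum-summarization, the example article, and the generation settings are assumptions for illustration rather than values documented on this page.

```python
# Minimal sketch: load the fine-tuned checkpoint with Hugging Face transformers
# and summarize a short English article. The repository id, example text, and
# generation settings are illustrative assumptions.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_id = "skripsi-summarization-1234/mbart-large-50-finetuned-xlsum-summarization"  # assumed repo id
tokenizer = MBart50TokenizerFast.from_pretrained(model_id, src_lang="en_XX")
model = MBartForConditionalGeneration.from_pretrained(model_id)

article = (
    "The summit concluded on Friday with delegates agreeing on a framework "
    "to cut emissions, though the timetable for implementation remains open."
)

# Encode the article and generate a summary with beam search.
inputs = tokenizer(article, return_tensors="pt", max_length=1024, truncation=True)
summary_ids = model.generate(
    **inputs,
    num_beams=4,
    max_length=84,
    no_repeat_ngram_size=3,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```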

Model Features

Multilingual Support
Supports text summarization and generation tasks in 50 languages.
Sequence-to-Sequence Architecture
Based on the BART architecture, suitable for various sequence-to-sequence tasks.
Pretrained Model
Pretrained on large-scale multilingual data, offering strong generalization capabilities.

Model Capabilities

Text Summarization
Multilingual Text Generation (see the sketch after this list)
Sequence-to-Sequence Tasks
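Because the model accepts input in any of the 50 supported languages, the source language is selected through the tokenizer's mBART-50 language code. The sketch below uses Indonesian (id_ID) as an example; the repository id and the sample sentence are again assumptions.

```python
# Sketch: summarize a non-English article by setting the mBART-50 source-language code.
# The repository id and the Indonesian sample sentence are illustrative assumptions.
from transformers import MBartForConditionalGeneration, MBart50TokenizerFast

model_id = "skripsi-summarization-1234/mbart-large-50-finetuned-xlsum-summarization"  # assumed repo id
tokenizer = MBart50TokenizerFast.from_pretrained(model_id, src_lang="id_ID")  # Indonesian input
model = MBartForConditionalGeneration.from_pretrained(model_id)

article_id = "Pemerintah mengumumkan kebijakan baru untuk mengurangi kemacetan di ibu kota."

inputs = tokenizer(article_id, return_tensors="pt", truncation=True)
summary_ids = model.generate(
    **inputs,
    forced_bos_token_id=tokenizer.lang_code_to_id["id_ID"],  # keep the summary in Indonesian
    num_beams=4,
    max_length=64,
)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```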

Use Cases

News Summarization
Multilingual News Summarization
Automatically summarizes long news articles into concise summaries across multiple languages.
Performs well on the XL-Sum dataset; see the usage sketch after this list.
Document Summarization
Technical Document Summarization
Automatically generates concise summaries of technical documents.
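For quick experiments with either use case, the transformers summarization pipeline can wrap the same checkpoint; the repository id, the sample document, and the length limits below are assumptions.

```python
# Sketch: news or document summarization through the transformers pipeline API.
# The repository id, sample document, and length limits are illustrative assumptions.
from transformers import pipeline

summarizer = pipeline(
    "summarization",
    model="skripsi-summarization-1234/mbart-large-50-finetuned-xlsum-summarization",  # assumed repo id
)

document = (
    "The new release introduces a streaming API, reduces memory usage during "
    "indexing, and deprecates the legacy configuration format."
)
result = summarizer(document, max_length=60, min_length=10, do_sample=False)
print(result[0]["summary_text"])
```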