# Multilingual Summarization

## Pegasus Xsum Finetuned Xlsum Summarization
PEGASUS-XSum is a pre-trained text summarization model based on the PEGASUS architecture, optimized for extreme summarization (XSum) tasks.
Text Generation · Transformers · by skripsi-summarization-1234 · 11.89k downloads · 0 likes
## Mbart Large 50 Finetuned Xlsum Summarization
mBART-large-50 is a multilingual sequence-to-sequence model supporting text summarization and generation in 50 languages.
Text Generation · Transformers · by skripsi-summarization-1234 · 28.54k downloads · 0 likes
## Csebuetnlp Mt5 Multilingual XLSum
A multilingual summarization model based on the mT5 architecture, fine-tuned on the XL-Sum dataset and supporting 45 languages.
Text Generation · Supports Multiple Languages · by MaagDeveloper · 14 downloads · 0 likes
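The XL-Sum model cards from csebuetnlp recommend normalizing whitespace before tokenization. A minimal sketch of that preprocessing step, assuming the `WHITESPACE_HANDLER` convention from those model cards (the sample article text here is made up):

```python
import re

# Whitespace normalizer as suggested in the csebuetnlp XL-Sum model cards:
# first collapse newlines into spaces, then collapse runs of whitespace
# into single spaces.
WHITESPACE_HANDLER = lambda text: re.sub(r"\s+", " ", re.sub(r"\n+", " ", text.strip()))

article = "Scraped news articles often contain\n\nhard line breaks and   extra spaces."
clean = WHITESPACE_HANDLER(article)
print(clean)
# The cleaned text would then be passed to the model, e.g.:
# from transformers import pipeline
# pipeline("summarization", model="csebuetnlp/mT5_multilingual_XLSum")(clean)
```

The heavy model call is left commented out; the normalization itself is plain Python and model-independent.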
## Amazon MultiLingual Review Summarization With Google Mt5 Small
License: Apache-2.0
An mT5-small model fine-tuned on a multilingual Amazon review dataset, designed to generate product review summaries in English and German.
Text Generation · Transformers · Supports Multiple Languages · by srvmishra832 · 21 downloads · 0 likes
## Meta Llama 3.1 8B Instruct Summarizer
License: Apache-2.0
A multilingual text summarization model fine-tuned from Llama 3.1 that supports English, Spanish, and Chinese, built on an optimized Transformer architecture with outputs further refined through RLHF.
Large Language Model · Transformers · by raaec · 305 downloads · 2 likes
## Mlong T5 Large Sumstew
License: Apache-2.0
A multilingual abstractive summarization model for long documents (up to 16k input tokens). Trained on the Sumstew dataset, it can generate both a title and a summary for a given input document.
Text Generation · Transformers · Supports Multiple Languages · by Joemgu · 103 downloads · 9 likes
## Long T5 Base Sumstew
A summarization model based on the Long-T5 architecture, supporting multilingual text summarization tasks.
Text Generation · Transformers · Supports Multiple Languages · by Joemgu · 27 downloads · 1 like
## PISCES
PISCES is a pre-trained multilingual summarization model that acquires language modeling, cross-lingual ability, and summarization skills through a three-stage pre-training process.
Text Generation · Transformers · by Krystalan · 15 downloads · 1 like
## Mt5 M2o Chinese Simplified Crosssum
An mT5 many-to-one model fine-tuned on the CrossSum dataset, capable of summarizing texts written in multiple languages into Simplified Chinese.
Text Generation · Transformers · Supports Multiple Languages · by csebuetnlp · 43 downloads · 20 likes
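A "many-to-one" model fixes the target language on the decoder side: generation is forced to start from the target language's token, so input in any supported language is summarized into that one language. A toy sketch of assembling such generation arguments, assuming a hypothetical language-to-token-id table (in a real model these ids come from the tokenizer, e.g. via `convert_tokens_to_ids`):

```python
# Illustrative many-to-one generation setup. The token ids below are made up
# for illustration only; a real checkpoint supplies them via its tokenizer.
LANG_TOKEN_IDS = {"zh-CN": 250024, "ru": 250021, "hi": 250019, "en": 250004}

def build_generate_kwargs(target_lang: str, num_beams: int = 4, max_length: int = 84) -> dict:
    """Assemble generation kwargs that pin the summary's output language."""
    if target_lang not in LANG_TOKEN_IDS:
        raise ValueError(f"unsupported target language: {target_lang}")
    return {
        # Forcing the decoder's first token to the target-language token is
        # what makes the summary come out in that language.
        "decoder_start_token_id": LANG_TOKEN_IDS[target_lang],
        "num_beams": num_beams,
        "max_length": max_length,
        "no_repeat_ngram_size": 2,
    }

kwargs = build_generate_kwargs("zh-CN")
print(kwargs)
```

These kwargs would then be passed to the model's `generate` call; the same pattern covers the Russian, Hindi, and English many-to-one checkpoints below by swapping the target language.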
## Bart CaPE Xsum
License: BSD-3-Clause
CaPE is a contrastive parameter ensemble method designed to reduce hallucination in abstractive summarization.
Text Generation · Transformers · English · by praf-choub · 22 downloads · 0 likes
## Mt5 M2o Russian Crosssum
A multilingual summarization model based on the mT5 architecture that summarizes texts in multiple languages into Russian.
Text Generation · Transformers · Supports Multiple Languages · by csebuetnlp · 66 downloads · 3 likes
## Mt5 M2o Hindi Crosssum
A many-to-one mT5 summarization model fine-tuned on the CrossSum dataset, supporting summarization of multilingual texts into Hindi.
Text Generation · Transformers · Supports Multiple Languages · by csebuetnlp · 22 downloads · 0 likes
## Mt5 M2m Crosssum
A multilingual summarization model fine-tuned on the CrossSum dataset, supporting cross-lingual summarization across 45 languages.
Text Generation · Transformers · Supports Multiple Languages · by csebuetnlp · 57 downloads · 8 likes
## Camembert2camembert Shared Finetuned French Summarization
A French text summarization model based on the CamemBERT architecture, fine-tuned for French news summarization.
Text Generation · Transformers · French · by mrm8488 · 540 downloads · 14 likes
## Mt5 M2o English Crosssum
A multilingual summarization model based on the mT5 architecture, capable of summarizing text in multiple languages into English.
Text Generation · Transformers · Supports Multiple Languages · by csebuetnlp · 16 downloads · 4 likes
© 2025 AIbase