# Sequence-to-sequence
## Mt5 Small Finetuned Gazeta Ru
Apache-2.0 · Text Generation · TensorBoard · by sansmislom · 33 downloads · 0 likes

A Russian abstractive summarization model fine-tuned on the Gazeta dataset, based on google/mt5-small.
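A minimal usage sketch for a checkpoint like this one, assuming it loads through the standard seq2seq classes; the Hub repository id below is an assumption, so substitute the actual path of the checkpoint:

```python
from transformers import AutoTokenizer, AutoModelForSeq2SeqLM

# Hypothetical repo id; replace with the actual Hub path of the checkpoint.
model_id = "sansmislom/mt5-small-finetuned-gazeta-ru"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSeq2SeqLM.from_pretrained(model_id)

# Summarize a Russian news article (placeholder text).
article = "Текст новостной статьи на русском языке..."
inputs = tokenizer(article, return_tensors="pt", truncation=True, max_length=512)
summary_ids = model.generate(**inputs, max_length=128, num_beams=4)
print(tokenizer.decode(summary_ids[0], skip_special_tokens=True))
```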
## Darija To English
Apache-2.0 · Machine Translation · Transformers · by centino00 · 34 downloads · 6 likes

A translation model for Darija (an Arabic dialect) written in Latin script into English, fine-tuned from Helsinki-NLP/opus-mt-ar-en.
## Inclusively Reformulation It5
Machine Translation · Transformers · by E-MIMIC · 70 downloads · 1 like

An Italian sequence-to-sequence model fine-tuned from IT5-large, designed for rewriting text into inclusive language.
## Kot5 Base
Large Language Model · Supports Multiple Languages · by wisenut-nlp-team · 296 downloads · 1 like

KoT5 is a Korean-English bilingual text generation model based on the T5 architecture, supporting tasks such as summarization and paraphrase generation.
## Autotrain Fr En Translate 51410121895
Machine Translation · Transformers · Supports Multiple Languages · by ybanas · 24 downloads · 3 likes

A French-to-English translation model built on Transformers and trained with AutoTrain.
## Banglat5 Small
Large Language Model · Transformers · Other · by csebuetnlp · 510 downloads · 2 likes

A pre-trained Bengali sequence-to-sequence Transformer model, optimized for natural language generation tasks.
## Banglat5
Large Language Model · Transformers · Other · by csebuetnlp · 1,102 downloads · 15 likes

BanglaT5 is a Bengali sequence-to-sequence Transformer model pre-trained with a span-corruption objective, achieving state-of-the-art performance on multiple Bengali natural language generation tasks.
## It5 Large News Summarization
Apache-2.0 · Text Generation · Other · by gsarti · 47 downloads · 1 like

An Italian news summarization model based on IT5-large, fine-tuned on the Fanpage and Il Post corpora.
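A minimal summarization sketch for this kind of checkpoint via the transformers pipeline; the repository id below is an assumption based on the listed author and model name:

```python
from transformers import pipeline

# Hypothetical repo id; use the actual Hub path of the checkpoint.
summarizer = pipeline("summarization", model="gsarti/it5-large-news-summarization")

# Summarize an Italian news article (placeholder text).
article = "Testo di un articolo di cronaca in lingua italiana..."
print(summarizer(article, max_length=96, num_beams=4)[0]["summary_text"])
```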
## It5 Large
Apache-2.0 · Large Language Model · Other · by gsarti · 37 downloads · 1 like

IT5 is the first family of sequence-to-sequence Transformer models pretrained at scale specifically for Italian, following the approach of the T5 model.
## Arabic T5 Small
Large Language Model · Arabic · by flax-community · 279 downloads · 10 likes

An Arabic language model using the T5 v1.1 small architecture, trained on a combination of several Arabic datasets.
## It5 Small
Apache-2.0 · Large Language Model · Other · by gsarti · 220 downloads · 2 likes

IT5 is the first family of sequence-to-sequence Transformer models pretrained at scale for Italian, following the approach of the original T5 model.
## It5 Base
Apache-2.0 · Large Language Model · Other · by gsarti · 389 downloads · 24 likes

IT5 is the first family of sequence-to-sequence Transformer models pretrained at scale specifically for Italian, based on the T5 architecture.
## It5 Base Oscar
Apache-2.0 · Large Language Model · Other · by gsarti · 19 downloads · 0 likes

A large-scale sequence-to-sequence Transformer model pre-trained specifically for Italian on the OSCAR corpus.
## Bart Squadv2
Question Answering System · Transformers · by aware-ai · 96 downloads · 1 like

A bart-large model fine-tuned on the SQuAD v2 dataset for question answering, suitable for natural language understanding and generation tasks.
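A minimal question-answering sketch, assuming the checkpoint is published as `aware-ai/bart-squadv2` (the exact repo id is an assumption) and exposes a standard extractive question-answering head:

```python
from transformers import pipeline

# Hypothetical repo id and task head; adjust to the actual checkpoint.
qa = pipeline("question-answering", model="aware-ai/bart-squadv2")
result = qa(
    question="What dataset was the model fine-tuned on?",
    context="The model is a bart-large checkpoint fine-tuned on the SQuAD v2 dataset.",
)
print(result["answer"], result["score"])
```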
## Wmt21 Dense 24 Wide X En
MIT · Machine Translation · Transformers · Supports Multiple Languages · by facebook · 17 downloads · 13 likes

A 4.7-billion-parameter multilingual encoder-decoder model that translates from 7 languages into English.
## M2m100 418M
MIT · Machine Translation · Supports Multiple Languages · by facebook · 1.6M downloads · 299 likes

M2M100 is a multilingual encoder-decoder model covering 100 languages and supporting 9,900 translation directions.
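A minimal translation sketch for the 418M checkpoint (`facebook/m2m100_418M` on the Hub), following the standard M2M100 usage in transformers: set the source language on the tokenizer, then force the target-language token at decoding time.

```python
from transformers import M2M100ForConditionalGeneration, M2M100Tokenizer

model = M2M100ForConditionalGeneration.from_pretrained("facebook/m2m100_418M")
tokenizer = M2M100Tokenizer.from_pretrained("facebook/m2m100_418M")

# Translate French to English: set the source language, then force the English BOS token.
tokenizer.src_lang = "fr"
encoded = tokenizer("La vie est comme une boîte de chocolat.", return_tensors="pt")
generated = model.generate(**encoded, forced_bos_token_id=tokenizer.get_lang_id("en"))
print(tokenizer.batch_decode(generated, skip_special_tokens=True)[0])
```

Any of the 100 supported language codes can be used for `src_lang` and for the forced target token, which is how a single checkpoint covers all 9,900 translation directions.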
## It5 Small Wiki Summarization
Apache-2.0 · Text Generation · Other · by gsarti · 32 downloads · 0 likes

An IT5 Small model fine-tuned on the WITS dataset for summarizing Italian Wikipedia content.