# Sequence Generation
## Question Decomposer T5
**thenHung** · Text Generation · Safetensors · English · 317 · 4

A sequence-to-sequence model based on T5-base, designed to decompose complex questions into multiple sub-questions.
## Flan T5 Base Samsum
**sharmax-vikas** · Apache-2.0 · Text Generation · Transformers · 26 · 0

A fine-tuned version of google/flan-t5-base on the samsum dataset, designed for dialogue summarization.
## Indobart V2 Amr To Text Linearized Penman Ilmy Epochs 3 With Lemma And Upos And Voice
**abdiharyadi** · MIT · Large Language Model · Transformers · 46 · 1

A fine-tuned version of indobenchmark/indobart-v2, designed for AMR-to-text generation.
## Reastap Large Finetuned Wikisql
**Yale-LILY** · Question Answering System · Transformers · English · 27 · 1

ReasTAP is a pretrained table-reasoning model that injects table reasoning skills through synthetic reasoning examples; this checkpoint is fine-tuned on the WikiSQL dataset.
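The synthetic-example idea behind ReasTAP-style pretraining can be illustrated with a toy sketch: generate a (question, answer) pair from a table by applying a reasoning template. The table, question template, and helper name below are invented for illustration and are not taken from ReasTAP itself.

```python
# Toy sketch of a synthetic table-reasoning example. The table,
# question template, and helper name are illustrative assumptions.

def make_max_question(table, column):
    """Build a (question, answer) pair asking which row maximizes `column`."""
    header = table["header"]
    col = header.index(column)
    # Pick the row with the largest value in the target column.
    best = max(table["rows"], key=lambda row: row[col])
    question = f"Which {header[0].lower()} has the highest {column.lower()}?"
    answer = best[0]
    return question, answer

toy_table = {
    "header": ["City", "Population"],
    "rows": [["Oslo", 709000], ["Bergen", 291000], ["Trondheim", 212000]],
}

q, a = make_max_question(toy_table, "Population")
print(q)  # Which city has the highest population?
print(a)  # Oslo
```

Pairs generated this way can be emitted at scale from many tables and templates, which is the sense in which the pretraining data is "synthetic."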
## Bart Large Xsum Finetuned Samsum V2
**amagzari** · MIT · Text Generation · Transformers · 48 · 1

A text summarization model based on facebook/bart-large-xsum, fine-tuned on the samsum dataset for generating dialogue summaries.
## Marian Finetuned Kde4 En To Ja
**Hoax0930** · Apache-2.0 · Machine Translation · Transformers · 43 · 0

An English-to-Japanese translation model based on Helsinki-NLP/opus-tatoeba-en-ja, fine-tuned on the kde4 dataset.
## It5 Efficient Small El32 News Summarization
**gsarti** · Apache-2.0 · Text Generation · Other · 97 · 4

An Italian news summarization model based on the IT5 Efficient Small EL32 architecture, fine-tuned on the Fanpage and Il Post datasets.
## Gpt2 Chinese Couplet
**uer** · Text Generation · Chinese · 491 · 10

A Chinese couplet generation model based on the GPT-2 architecture, pre-trained with the UER-py framework; it generates Chinese text that conforms to traditional couplet formats.
## Marian Finetuned Kde4 En To Zh TW
**peterhsu** · Apache-2.0 · Machine Translation · Transformers · 38 · 2

An English-to-Traditional-Chinese translation model based on Helsinki-NLP/opus-mt-en-zh, fine-tuned on the kde4 dataset; it achieves a BLEU score of 39.0863.
## Tts Hubert Cluster Bart Base
**voidful** · Apache-2.0 · Speech Recognition · Transformers · Supports Multiple Languages · 24 · 1

A speech processing model based on HuBERT and BART, supporting automatic speech recognition (ASR) tasks.
## Squad Mbart Model
**ZYW** · Question Answering System · Transformers · 18 · 0

An mBART model trained from scratch on an unspecified dataset; its specific uses and capabilities are not documented.
## T5 Small Finetuned En To Ro Dataset 20 Input 64
**aretw0** · Apache-2.0 · Machine Translation · Transformers · 14 · 0

An English-to-Romanian translation model based on T5-small, fine-tuned on the wmt16 dataset.
## Sber Rut5 Filler
**IlyaGusev** · Apache-2.0 · Large Language Model · Transformers · Other · 19 · 3

A Russian text processing model, primarily used for text generation and text completion.