# Multi-dataset Pretraining
## Sundanese Roberta Base
A Sundanese masked language model based on the RoBERTa architecture, trained on multiple datasets.
- Publisher: w11wo
- License: MIT
- Tags: Large Language Model, Other
## Pegasus Billsum
PEGASUS is an abstractive summarization model pre-trained with a gap-sentence generation objective; this checkpoint is fine-tuned on the BillSum dataset of US legislative bills to produce high-quality text summaries.
- Publisher: google
- Task: Text Generation
- Tags: Transformers, English
## Pegasus Aeslc
PEGASUS pre-trained with the same gap-sentence generation objective, fine-tuned for abstractive summarization on the AESLC dataset of email subject-line generation.
- Publisher: google
- Task: Text Generation
- Tags: Transformers, English
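This checkpoint exposes the same summarization interface as Pegasus Billsum above; assuming it is hosted as `google/pegasus-aeslc`, swapping that model ID into the pipeline call in the previous sketch is sufficient, with email bodies as input and short subject-line-style summaries as output.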