
# Multi-dataset Pretraining

## Sundanese Roberta Base
Author: w11wo · License: MIT
A Sundanese masked language model based on the RoBERTa architecture, trained on multiple datasets.
Tags: Large Language Model, Other
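
As a RoBERTa-style masked language model, it is typically queried through a fill-mask pipeline. Below is a minimal sketch using Hugging Face transformers; the repository id `w11wo/sundanese-roberta-base` and the sample sentence are assumptions inferred from this listing, not confirmed by it.

```python
# Minimal fill-mask sketch. The repo id is inferred from the author
# (w11wo) and model name shown in this listing -- verify before use.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="w11wo/sundanese-roberta-base")

# RoBERTa tokenizers use "<mask>" as the mask token. The input is an
# illustrative Sundanese sentence ("Budi is <mask> at school").
for pred in fill_mask("Budi keur <mask> di sakola."):
    print(f"{pred['token_str']!r}  score={pred['score']:.3f}")
```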
## Pegasus Billsum
Author: google
PEGASUS is an abstractive summarization model pre-trained with a gap-sentence objective: important sentences are extracted and masked from a document, and the model learns to generate them, which transfers well to producing high-quality summaries. This checkpoint targets BillSum, a benchmark for summarizing US legislative bills; a usage sketch follows the last entry below.
Tags: Text Generation, Transformers, English
## Pegasus Aeslc
Author: google
PEGASUS is a pretrained model built on the same gap-sentence objective, designed for abstractive text summarization. This checkpoint targets AESLC, a dataset of email subject lines.
Tags: Text Generation, Transformers, English
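
Both PEGASUS checkpoints above are used the same way through a summarization pipeline. The sketch below is a minimal example with Hugging Face transformers; the repository ids `google/pegasus-billsum` and `google/pegasus-aeslc` and the sample input are assumptions inferred from this listing.

```python
# Minimal abstractive-summarization sketch for the PEGASUS checkpoints
# listed above. Repo ids are inferred from the author (google) and the
# model names in this listing -- verify before use.
from transformers import pipeline

summarizer = pipeline("summarization", model="google/pegasus-billsum")

# Illustrative input; BillSum checkpoints expect legislative-bill text.
document = (
    "The bill amends the Internal Revenue Code to extend the tax credit "
    "for residential energy efficient property and modifies the phase-out "
    "schedule for qualifying solar installations."
)

summary = summarizer(document, max_length=48, min_length=8)
print(summary[0]["summary_text"])

# The AESLC checkpoint (email subject-line generation) is used the same
# way: pipeline("summarization", model="google/pegasus-aeslc")
```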