# Indonesian Pretrained Models
## IndoT5 Small
A T5-small model pretrained on the Indonesian mC4 dataset; it requires fine-tuning before use.
- Tags: Large Language Model · Transformers · Other
- Author: Wikidepia
- Downloads: 83 · Likes: 0
## BERT Base Indonesian 1.5G
A BERT-base Indonesian model pretrained on Wikipedia and newspaper data, suitable for a variety of natural language processing tasks.
- License: MIT
- Tags: Large Language Model · Other
- Author: cahya
- Downloads: 40.08k · Likes: 5
## IndoT5 Base
A T5 (Text-to-Text Transfer Transformer) base model pretrained on the Indonesian mC4 dataset; it requires fine-tuning before use.
- Tags: Large Language Model · Transformers · Other
- Author: Wikidepia
- Downloads: 635 · Likes: 1