# Monolingual corpus
## SinBERT Large
NLPC-UOM · MIT · 150 · 6
Large Language Model · Transformers · Other

SinBERT is a Sinhala pre-trained language model based on the RoBERTa architecture, trained on a large Sinhala monolingual corpus (sin-cc-15M).
## SinBERT Small
NLPC-UOM · MIT · 126 · 4
Large Language Model · Transformers · Other

SinBERT is pretrained on a large Sinhala monolingual corpus (sin-cc-15M) using the RoBERTa architecture and is suitable for Sinhala text-processing tasks.