# Monolingual corpus

## SinBERT Large
License: MIT | Organization: NLPC-UOM | Tags: Large Language Model, Transformers, Other

SinBERT is a Sinhala pre-trained language model based on the RoBERTa architecture, trained on a large Sinhala monolingual corpus (sin-cc-15M).
## SinBERT Small
License: MIT | Organization: NLPC-UOM | Tags: Large Language Model, Transformers, Other

SinBERT Small is pretrained on the same large Sinhala monolingual corpus (sin-cc-15M) using the RoBERTa architecture, and is suitable for Sinhala text processing tasks.
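
Since both checkpoints are RoBERTa-based Transformers models, they can be loaded with the standard Hugging Face `AutoTokenizer`/`AutoModel` interface. The sketch below shows how contextual embeddings for a Sinhala sentence might be extracted; the repository ID `NLPC-UOM/SinBERT-small` and the example sentence are assumptions, not taken from this page, so adjust them to the actual Hub ID and your own text.

```python
# Minimal sketch: load a RoBERTa-based SinBERT checkpoint with Hugging Face
# Transformers and extract contextual embeddings for a Sinhala sentence.
# The repository ID below is an assumption; replace it with the actual Hub ID
# (e.g. the SinBERT Large checkpoint) as needed.
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "NLPC-UOM/SinBERT-small"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)
model.eval()

# Example Sinhala input (roughly "This is a test sentence."); illustration only.
text = "මෙය පරීක්ෂණ වාක්‍යයකි."

inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Last-layer hidden states: shape (batch_size, sequence_length, hidden_size)
embeddings = outputs.last_hidden_state
print(embeddings.shape)
```

For downstream Sinhala text processing tasks such as classification, the same checkpoint could instead be loaded with `AutoModelForSequenceClassification` and fine-tuned on labeled data.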