# Small RoBERTa

## CodeBERTa Small V1
CodeBERTa is a code-understanding model based on the RoBERTa architecture, trained on multiple programming languages and well suited to code-related tasks.
Tags: Large Language Model, Transformers, Other
Publisher: claudios
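As a quick illustration, here is a minimal fill-mask sketch using the Hugging Face transformers pipeline. The model id claudios/CodeBERTa-small-v1 is an assumption inferred from the publisher shown above; swap in the actual repository id if it differs.

```python
from transformers import pipeline

# Assumed Hugging Face model id, inferred from the publisher listed above.
fill_mask = pipeline("fill-mask", model="claudios/CodeBERTa-small-v1")

# RoBERTa-style models use the <mask> token. Mask an operator inside a
# Python snippet and let the model propose completions.
code = "def add(a, b):\n    return a <mask> b"
for prediction in fill_mask(code, top_k=3):
    print(f"{prediction['token_str']!r}  score={prediction['score']:.3f}")
```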
## RoBERTa Ko Small TSDAE
License: MIT
This is a small Korean RoBERTa model built with sentence-transformers. It maps sentences and paragraphs into a 256-dimensional dense vector space, making it suitable for tasks such as clustering and semantic search.
Tags: Text Embedding, Transformers, Korean
Publisher: smartmind
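A minimal sketch of embedding Korean sentences with the sentence-transformers API the entry describes. The model id smartmind/roberta-ko-small-tsdae is an assumption inferred from the publisher above.

```python
from sentence_transformers import SentenceTransformer
from sentence_transformers.util import cos_sim

# Assumed Hugging Face model id, inferred from the publisher listed above.
model = SentenceTransformer("smartmind/roberta-ko-small-tsdae")

sentences = [
    "오늘 날씨가 정말 좋네요.",    # "The weather is really nice today."
    "날씨가 매우 화창합니다.",     # "It is very sunny."
    "주가가 크게 하락했습니다.",   # "Stock prices fell sharply."
]
embeddings = model.encode(sentences)
print(embeddings.shape)  # expected (3, 256), per the 256-dim description above

# The first two sentences should score noticeably higher than the third.
print(cos_sim(embeddings[0], embeddings[1:]))
```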
## RoBERTa Small Greek
This is a small Greek language model based on the RoBERTa architecture, with roughly half the parameters of the base model. It is suited to fill-mask (masked-token prediction) on Greek text.
Tags: Large Language Model, Transformers, Other
Publisher: ClassCat
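Since the entry names fill-mask as the target task, a minimal pipeline sketch follows. The model id ClassCat/roberta-small-greek is an assumption inferred from the publisher above.

```python
from transformers import pipeline

# Assumed Hugging Face model id, inferred from the publisher listed above.
fill_mask = pipeline("fill-mask", model="ClassCat/roberta-small-greek")

# "Athens is the <mask> of Greece." A reasonable model should rank
# a word like 'πρωτεύουσα' (capital) among the top candidates.
for prediction in fill_mask("Η Αθήνα είναι η <mask> της Ελλάδας.", top_k=3):
    print(prediction["token_str"], round(prediction["score"], 3))
```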
## RoBERTa Small Belarusian
This is a small RoBERTa model pretrained on the CC-100 dataset, suitable for Belarusian text-processing tasks.
Tags: Large Language Model, Transformers, Other
Publisher: KoichiYasuoka
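For contrast with the pipeline examples above, this sketch scores a masked Belarusian token directly with AutoModelForMaskedLM. The model id KoichiYasuoka/roberta-small-belarusian is an assumption inferred from the publisher above.

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Assumed Hugging Face model id, inferred from the publisher listed above.
model_id = "KoichiYasuoka/roberta-small-belarusian"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# "Minsk is the <mask> of Belarus." Score candidates for the masked slot.
text = f"Мінск - гэта {tokenizer.mask_token} Беларусі."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask position and decode the top-3 candidate tokens.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
top_ids = logits[0, mask_pos].topk(3).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))
```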