# Masked language understanding

RuModernBERT Base
deepvk · Apache-2.0
A modern bidirectional, encoder-only Transformer model for Russian, pre-trained on approximately 2 trillion Russian, English, and code tokens, with a context length of up to 8,192 tokens.
Tags: Large Language Model, Transformers, Supports Multiple Languages
2,992 · 40
ScandiBERT
vesteinn
A BERT model supporting Icelandic, Danish, Swedish, Norwegian, and Faroese, with excellent performance on the ScandEval leaderboard.
Tags: Large Language Model, Transformers, Other
122 · 4
© 2025 AIbase