# Miniature Distilled BERT
Model: Rubert Tiny
License: MIT
Description: An extremely compact distilled version (45 MB, 12M parameters) of the bert-base-multilingual-cased model for Russian and English, prioritizing speed and size over absolute accuracy.
Tags: Large Language Model, Transformers, Supports Multiple Languages
Author: cointegrated
Downloads: 36.18k
Likes: 41
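
Below is a minimal sketch of how a distilled BERT like this might be loaded with the Transformers library. The repo ID `cointegrated/rubert-tiny` is assumed from the author and model name shown above, and the CLS-token pooling is an illustrative choice rather than something stated on this page.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Assumed Hugging Face repo ID (author "cointegrated" + model "Rubert Tiny");
# verify on the model hub before depending on it.
MODEL_ID = "cointegrated/rubert-tiny"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)
model.eval()

# The model targets both Russian and English text.
sentences = ["привет мир", "hello world"]

# Tokenize as one padded batch so a single forward pass handles both sentences.
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")

with torch.no_grad():
    output = model(**batch)

# Use the [CLS] token embedding as a simple sentence representation
# (one common pooling strategy for BERT-style encoders).
embeddings = output.last_hidden_state[:, 0, :]
embeddings = torch.nn.functional.normalize(embeddings, dim=-1)

print(embeddings.shape)  # (2, hidden_size)
```

Because the model is only ~45 MB, it loads quickly on CPU, which is the main trade-off the description notes: speed and size over absolute accuracy.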