
ruBert Base

Developed by ai-forever
A pre-trained Russian BERT-base model from the SberDevices team, with 178 million parameters, trained on 30 GB of Russian text
Downloads 29.43k
Release Date: 3/2/2022

Model Overview

A pre-trained Russian Transformer language model, mainly used for mask-filling (masked language modeling) tasks

Model Features

Russian language optimization
Pre-trained specifically for Russian, performing well on Russian NLP tasks
Efficient architecture
Based on the BERT-base architecture, with a moderate parameter count and efficient inference
Large-scale pre-training
Pre-trained on 30 GB of Russian text, giving strong language understanding capabilities
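The pre-training behind these capabilities is masked language modeling: a fraction of the input tokens is hidden and the model learns to recover them from context. A toy sketch of BERT-style input masking; the 15% default rate and the [MASK] token are standard BERT conventions, not details stated in this card:

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", rate=0.15, seed=0):
    """Replace roughly `rate` of the tokens with mask_token.

    Returns (masked, labels), where labels[i] holds the original token
    at each masked position and None elsewhere.
    """
    rng = random.Random(seed)
    masked, labels = [], []
    for t in tokens:
        if rng.random() < rate:
            masked.append(mask_token)
            labels.append(t)
        else:
            masked.append(t)
            labels.append(None)
    return masked, labels

# Mask about half the tokens of a short Russian sentence.
masked, labels = mask_tokens("Москва столица России".split(), rate=0.5)
```

During pre-training, the model is trained to predict each non-None label from the masked sequence; at inference time the same mechanism powers mask filling.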

Model Capabilities

Text understanding
Mask filling
Context feature extraction
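The mask-filling capability can be tried with the Hugging Face `fill-mask` pipeline. A minimal sketch, assuming the model is published on the Hub as `ai-forever/ruBert-base` and uses the standard BERT [MASK] token; `top_predictions` is an illustrative helper, and `demo()` is defined but not called because it downloads the model weights:

```python
def top_predictions(preds, k=3):
    """Rank raw fill-mask output and keep the k best (token, score) pairs."""
    ranked = sorted(preds, key=lambda p: p["score"], reverse=True)
    return [(p["token_str"], round(p["score"], 3)) for p in ranked[:k]]

def demo():
    # Imported here so the sketch loads without transformers installed;
    # the pipeline call fetches the (assumed) ai-forever/ruBert-base weights.
    from transformers import pipeline
    fill = pipeline("fill-mask", model="ai-forever/ruBert-base")
    # "Moscow is the capital of [MASK]." — the pipeline predicts the masked word.
    preds = fill("Москва - столица [MASK].")
    print(top_predictions(preds))
```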

Use Cases

Natural Language Processing
Text auto-completion
Predicts masked words or phrases in text
Text feature extraction
Provides text representations for downstream NLP tasks
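For the feature-extraction use case, one common recipe is mean pooling over the model's final hidden states. A sketch assuming the `ai-forever/ruBert-base` Hub id; `mean_pool` and `sentence_embedding` are illustrative helpers, not part of the model's API:

```python
def mean_pool(token_vecs, mask):
    """Average the token vectors whose attention-mask entry is 1."""
    kept = [v for v, m in zip(token_vecs, mask) if m]
    dim = len(kept[0])
    return [sum(v[i] for v in kept) / len(kept) for i in range(dim)]

def sentence_embedding(text):
    # Imported here so the sketch loads without transformers/torch installed;
    # from_pretrained downloads the (assumed) ai-forever/ruBert-base weights.
    from transformers import AutoTokenizer, AutoModel
    import torch
    tok = AutoTokenizer.from_pretrained("ai-forever/ruBert-base")
    model = AutoModel.from_pretrained("ai-forever/ruBert-base")
    enc = tok(text, return_tensors="pt")
    with torch.no_grad():
        out = model(**enc)
    return mean_pool(out.last_hidden_state[0].tolist(),
                     enc["attention_mask"][0].tolist())
```

The resulting fixed-length vector can feed a downstream classifier or a similarity search.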
© 2025 AIbase