RuBERT Base Cased

Developed by DeepPavlov
RuBERT is a Russian BERT model trained on the Russian part of Wikipedia and on news data. It has 180 million parameters and was pre-trained with masked language modeling and next sentence prediction objectives.
Downloads: 275.78k
Released: 3/2/2022

Model Overview

RuBERT is a BERT model optimized for Russian and suitable for a wide range of Russian natural language processing tasks, such as text classification, named entity recognition, and question answering.
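
As a concrete starting point, the sketch below exercises the masked language modeling head through the fill-mask pipeline. It assumes the Hugging Face Transformers library is installed and uses the published checkpoint name DeepPavlov/rubert-base-cased; the example sentence is illustrative.

```python
# A minimal sketch, assuming Hugging Face Transformers and the
# DeepPavlov/rubert-base-cased checkpoint.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="DeepPavlov/rubert-base-cased")

# Predict the masked token in "Москва - [MASK] России."
# ("Moscow is the [MASK] of Russia.")
for prediction in fill_mask("Москва - [MASK] России."):
    print(f"{prediction['token_str']}: {prediction['score']:.3f}")
```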

Model Features

Russian Optimization
Trained specifically for Russian, with a subword vocabulary built from Russian Wikipedia and news data.
Multi-task Support
Pre-trained on both the masked language modeling and next sentence prediction objectives.
Large-scale Pre-training
A Transformer encoder with 180 million parameters (see the parameter-count sketch after this list).
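
The 180-million-parameter figure can be sanity-checked directly. A minimal sketch, assuming PyTorch and Transformers are installed:

```python
# A minimal sketch, assuming PyTorch and Hugging Face Transformers;
# loads the encoder and sums its parameter tensors.
from transformers import AutoModel

model = AutoModel.from_pretrained("DeepPavlov/rubert-base-cased")
total = sum(p.numel() for p in model.parameters())
print(f"{total / 1e6:.0f}M parameters")  # expected to land near 180M
```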

Model Capabilities

Russian Text Understanding
Text Feature Extraction
Semantic Similarity Calculation (see the sketch after this list)
Text Classification
Named Entity Recognition
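
Feature extraction and semantic similarity go together in practice: the sketch below embeds two Russian sentences and compares them with cosine similarity. Mean pooling over the last hidden states is a common convention assumed here, not something this card prescribes; torch and transformers are assumed installed.

```python
# A hedged sketch of sentence similarity with RuBERT; mean pooling is one
# common choice of sentence representation, not an official recommendation.
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("DeepPavlov/rubert-base-cased")
model = AutoModel.from_pretrained("DeepPavlov/rubert-base-cased")

# Two illustrative Russian sentences with similar meaning
# ("I love cats." / "I like cats.").
sentences = ["Я люблю кошек.", "Мне нравятся коты."]
inputs = tokenizer(sentences, padding=True, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool the token embeddings, ignoring padding positions.
mask = inputs["attention_mask"].unsqueeze(-1)
embeddings = (outputs.last_hidden_state * mask).sum(dim=1) / mask.sum(dim=1)

similarity = torch.nn.functional.cosine_similarity(
    embeddings[0], embeddings[1], dim=0
)
print(f"cosine similarity: {similarity.item():.3f}")
```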

Use Cases

Text Processing
Russian Text Classification
Classify Russian news articles or documents.
Russian Question Answering System
Build question answering applications for Russian text.
Information Extraction
Russian Named Entity Recognition
Extract entities such as person names and locations from Russian text (see the sketch after this list).
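
The base checkpoint ships without task-specific heads, so named entity recognition requires attaching and fine-tuning a token classification head. A minimal sketch of the setup follows; the BIO label set is an illustrative assumption, not part of the checkpoint.

```python
# A minimal sketch of preparing RuBERT for Russian NER fine-tuning. The
# label set is a hypothetical example; the classification head is randomly
# initialized and must be trained on a labeled Russian corpus.
from transformers import AutoModelForTokenClassification, AutoTokenizer

labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG"]

tokenizer = AutoTokenizer.from_pretrained("DeepPavlov/rubert-base-cased")
model = AutoModelForTokenClassification.from_pretrained(
    "DeepPavlov/rubert-base-cased",
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)
# From here, fine-tune with the usual token classification recipe
# (e.g. transformers.Trainer) before extracting entities.
```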