DistilBERT Base Ru Cased
This is a compact version of the multilingual distilled BERT base model (case-sensitive), optimized for Russian. It produces semantic representations identical to those of the original model for Russian text, preserving the original model's accuracy.
Downloads: 498
Release date: 3/2/2022
Model Overview
This model is a Russian distilled BERT trained on Wikipedia data. It preserves the semantic representation capabilities of the original BERT model while reducing model size.
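A minimal usage sketch with the Hugging Face transformers library is shown below. The Hub ID Geotrend/distilbert-base-ru-cased is an assumption inferred from the model name above; substitute the actual ID if it differs.

```python
# Minimal sketch: load the model and extract token-level representations.
# The Hub ID below is an assumption inferred from the model name.
import torch
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "Geotrend/distilbert-base-ru-cased"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

text = "Москва является столицей России."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# last_hidden_state has shape (batch, seq_len, hidden_dim):
# one contextual embedding per input token.
print(outputs.last_hidden_state.shape)
```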
Model Features
Consistent Semantic Representation
Generates semantic representations identical to those of the original model, so the original accuracy is preserved.
Compact Model
By covering only Russian, this version is smaller than the original multilingual model.
Multilingual Support
The same approach can be used to generate compact versions of other multilingual Transformers.
Model Capabilities
Text Understanding
Semantic Representation
Russian Text Processing
Use Cases
Natural Language Processing
Russian Text Classification
Can be used for classification tasks involving Russian text.
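A sketch of how the backbone could serve a classification task follows; the Hub ID is the same assumption as above, and the classification head here is randomly initialized, so it would need fine-tuning on labeled Russian data before producing meaningful predictions.

```python
# Sketch: attach a sequence-classification head to the backbone.
# The head is freshly initialized and must be fine-tuned before use.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "Geotrend/distilbert-base-ru-cased"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_ID, num_labels=2  # e.g. binary sentiment; labels are illustrative
)

inputs = tokenizer("Отличный фильм, всем рекомендую!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# With an untrained head the probabilities will be near-uniform.
print(logits.softmax(dim=-1))
```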
Semantic Similarity Calculation
Calculates semantic similarity between Russian texts.
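One common way to compute this (not prescribed by this model card) is cosine similarity over mean-pooled token embeddings; the sketch below assumes the same Hub ID as above.

```python
# Sketch: semantic similarity via mean-pooled embeddings + cosine similarity.
import torch
import torch.nn.functional as F
from transformers import AutoModel, AutoTokenizer

MODEL_ID = "Geotrend/distilbert-base-ru-cased"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModel.from_pretrained(MODEL_ID)

def embed(text: str) -> torch.Tensor:
    """Mean-pool the last hidden states over non-padding tokens."""
    inputs = tokenizer(text, return_tensors="pt", truncation=True)
    with torch.no_grad():
        hidden = model(**inputs).last_hidden_state  # (1, seq_len, dim)
    mask = inputs["attention_mask"].unsqueeze(-1)   # (1, seq_len, 1)
    return (hidden * mask).sum(dim=1) / mask.sum(dim=1)

a = embed("Кошка сидит на диване.")
b = embed("Кот лежит на софе.")
print(F.cosine_similarity(a, b).item())  # closer to 1.0 means more similar
```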