
RoBERTa Small Bulgarian

Developed by iarfmoose
This is a streamlined version of the Bulgarian RoBERTa model: it contains only 6 hidden layers while retaining performance comparable to the full model.
Downloads: 21
Release Time: 3/2/2022

Model Overview

The model is suitable for Bulgarian cloze tasks (masked language modeling) or for fine-tuning on other downstream tasks.
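A minimal cloze sketch using the transformers fill-mask pipeline; the model ID iarfmoose/roberta-small-bulgarian is an assumption based on the author and model names above:

from transformers import pipeline

# Assumed model ID; adjust if the hosted ID differs.
fill_mask = pipeline("fill-mask", model="iarfmoose/roberta-small-bulgarian")

# RoBERTa-style tokenizers use <mask> as the mask token.
# "Това е <mask> ден." means "This is a <mask> day."
for prediction in fill_mask("Това е <mask> ден."):
    print(prediction["token_str"], prediction["score"])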

Model Features

Streamlined Architecture
Contains only 6 hidden layers, making it lighter than the full-size model (see the quick check after this list).
Bulgarian Language Optimization
Specially trained and optimized for the Bulgarian language.
Dynamic Masking Training
Pre-trained with RoBERTa's dynamic masking strategy.
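As a quick check of the streamlined-architecture claim, the layer count can be read from the model config (same assumed model ID as above):

from transformers import AutoConfig

config = AutoConfig.from_pretrained("iarfmoose/roberta-small-bulgarian")
print(config.num_hidden_layers)  # expected to print 6 for this model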

Model Capabilities

Text completion
Language understanding
Text feature extraction
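A minimal feature-extraction sketch, assuming the same model ID; the encoder's last hidden state serves as a token-level representation of the input text:

import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("iarfmoose/roberta-small-bulgarian")
model = AutoModel.from_pretrained("iarfmoose/roberta-small-bulgarian")

inputs = tokenizer("Това е примерно изречение.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Token-level features with shape (batch_size, sequence_length, hidden_size).
features = outputs.last_hidden_state
print(features.shape)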

Use Cases

Natural Language Processing
Cloze Test
Predicts masked words in the text.
Text Classification
Can be fine-tuned for text classification tasks.
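A fine-tuning setup sketch for classification, assuming the same model ID and a two-label task for illustration; data loading and the training loop are left to the reader:

from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("iarfmoose/roberta-small-bulgarian")
model = AutoModelForSequenceClassification.from_pretrained(
    "iarfmoose/roberta-small-bulgarian", num_labels=2
)
# A fresh classification head is initialized on top of the pretrained encoder;
# train it with transformers.Trainer or a standard PyTorch loop on labeled data.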