Ro Bart 512
Developed by Iulian277
This is a BART base model pre-trained from scratch, specifically designed for Romanian text processing tasks.
Downloads: 27
Release date: 4/13/2023
Model Overview
The model is pre-trained on a 50 GB Romanian text corpus using a text-corruption (denoising) objective; it cannot be used directly for downstream tasks and must be fine-tuned first.
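To illustrate the text-corruption objective mentioned above, here is a minimal sketch of BART-style text infilling, in which a contiguous span of tokens is replaced by a single mask token and the model learns to reconstruct the original text. The `corrupt` helper, the mask string, and the masking ratio are illustrative assumptions, not the model's actual pre-training code.

```python
import random

MASK = "<mask>"  # placeholder token; the real tokenizer defines its own mask symbol

def corrupt(tokens, mask_ratio=0.3, seed=0):
    """Replace one contiguous span of tokens with a single <mask> token,
    mimicking BART's text-infilling pre-training objective (illustrative only)."""
    rng = random.Random(seed)
    span_len = max(1, int(len(tokens) * mask_ratio))
    start = rng.randrange(len(tokens) - span_len + 1)
    return tokens[:start] + [MASK] + tokens[start + span_len:]

tokens = "acesta este un exemplu de text in limba romana".split()
corrupted = corrupt(tokens)
print(" ".join(corrupted))
```

During pre-training, the corrupted sequence is the encoder input and the original sequence is the decoder target, which is why the pre-trained checkpoint can reconstruct masked spans but still needs fine-tuning for other downstream tasks.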
Model Features
Large-scale pre-training
Trained on a 50 GB Romanian text corpus, giving it strong Romanian language understanding
Long sequence processing
Supports sequences up to 512 tokens in length
Specialized optimization
Pre-trained exclusively on Romanian text rather than adapted from a multilingual model
Model Capabilities
Text denoising (reconstructing corrupted text)
Romanian text understanding
Use Cases
Text preprocessing
Text corruption repair
Can be used to repair corrupted Romanian text