Ro Bart 1024
Developed by Iulian277
This is a BART-base model with 140 million parameters, pre-trained from scratch on a 50GB Romanian text corpus.
Downloads: 85
Release date: 5/23/2023
Model Overview
The model is pre-trained with BART's text-corruption (denoising) objective and cannot be used directly on downstream tasks without fine-tuning.
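As a quick smoke test, the checkpoint can be loaded for mask infilling, the task it was pre-trained on. A minimal sketch, assuming the Hugging Face repo id is Iulian277/ro-bart-1024 (the id is inferred from the developer name and is not confirmed by this page):

```python
from transformers import AutoTokenizer, BartForConditionalGeneration

model_id = "Iulian277/ro-bart-1024"  # assumed repo id; verify on the Hub
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = BartForConditionalGeneration.from_pretrained(model_id)

# BART's pre-training objective is denoising: the model reconstructs
# spans hidden behind the tokenizer's mask token.
text = f"Capitala României este {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")
outputs = model.generate(**inputs, max_length=32)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```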
Model Features
Large-scale pre-training
Trained on a 50GB Romanian text corpus, giving it strong Romanian language understanding.
Long sequence processing
Supports a maximum sequence length of 1024 tokens during training, making it suitable for long texts (see the tokenization sketch after this list).
Trained from scratch
The model is entirely pre-trained from scratch, not based on fine-tuning of existing models.
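To take advantage of the 1024-token window, inputs should be truncated (or chunked) at that length. A short sketch, again assuming the Iulian277/ro-bart-1024 repo id:

```python
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("Iulian277/ro-bart-1024")  # assumed repo id

# A long Romanian document; repeated here only to exceed 1024 tokens.
long_text = "Un paragraf lung în limba română despre istoria țării. " * 200
inputs = tokenizer(
    long_text,
    truncation=True,
    max_length=1024,  # matches the training sequence length
    return_tensors="pt",
)
print(inputs["input_ids"].shape)  # -> (1, 1024)
```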
Model Capabilities
Pre-training for text corruption tasks
Romanian text processing
Use Cases
Natural Language Processing
Text repair
Can be used as a pre-trained base model for text repair tasks.
Text generation
After fine-tuning, it can be used for Romanian text generation tasks.
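Because the checkpoint only knows the denoising objective, generation use cases require a fine-tuning pass. A hedged sketch using the Hugging Face Seq2SeqTrainer; the repo id, dataset, and column names below are placeholders, not details from this page:

```python
from datasets import Dataset
from transformers import (
    AutoTokenizer,
    BartForConditionalGeneration,
    DataCollatorForSeq2Seq,
    Seq2SeqTrainer,
    Seq2SeqTrainingArguments,
)

model_id = "Iulian277/ro-bart-1024"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = BartForConditionalGeneration.from_pretrained(model_id)

# Placeholder data: replace with a real Romanian (source, target) corpus,
# e.g. summarization pairs.
raw = Dataset.from_dict({
    "source": ["Un text lung care trebuie rezumat ..."],
    "target": ["Un rezumat scurt."],
})

def preprocess(batch):
    # Encode sources up to the model's 1024-token limit; targets are shorter.
    model_inputs = tokenizer(batch["source"], truncation=True, max_length=1024)
    labels = tokenizer(text_target=batch["target"], truncation=True, max_length=128)
    model_inputs["labels"] = labels["input_ids"]
    return model_inputs

tokenized = raw.map(preprocess, batched=True, remove_columns=raw.column_names)

trainer = Seq2SeqTrainer(
    model=model,
    args=Seq2SeqTrainingArguments(output_dir="ro-bart-finetuned", num_train_epochs=1),
    train_dataset=tokenized,
    data_collator=DataCollatorForSeq2Seq(tokenizer, model=model),
)
trainer.train()
```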