
Ro Bart Large 512

Developed by Iulian277
This is a BART large model with 400 million parameters, pretrained from scratch specifically for the Romanian language.
Downloads 141
Release Time: 9/4/2023

Model Overview

The model is pretrained with a text-corruption (denoising) objective and must be fine-tuned before it can be used for downstream applications.
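The corruption objective can be illustrated with a minimal sketch of BART-style text infilling, where a contiguous span is replaced by a single mask token and the model learns to regenerate it. This is an illustration of the general technique, not the exact noising recipe used for this model:

```python
MASK = "<mask>"

def corrupt(tokens, span_start, span_len):
    """BART-style text infilling: replace a contiguous span of tokens
    with a single <mask> token; the model must regenerate the span."""
    return tokens[:span_start] + [MASK] + tokens[span_start + span_len:]

# Toy Romanian sentence, whitespace-tokenized for illustration only.
tokens = "Acesta este un exemplu de text in limba romana".split()
corrupted = corrupt(tokens, span_start=2, span_len=3)
# corrupted == ['Acesta', 'este', '<mask>', 'text', 'in', 'limba', 'romana']
```

During pretraining the decoder is trained to reconstruct the original sequence from such corrupted inputs.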

Model Features

Large-scale Romanian pretraining: trained on a 50 GB cleaned Romanian text corpus
Long sequence processing: supports input sequences up to 512 tokens
Deep pretraining: trained for 3 million steps for thorough learning of language features

Model Capabilities

Romanian text understanding
Text corruption task processing

Use Cases

Natural Language Processing
Text summarization: can be fine-tuned for Romanian text summarization tasks
Machine translation: can serve as a base model for Romanian-related translation models
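A hedged usage sketch with the Hugging Face transformers library; the repo id "Iulian277/ro-bart-large-512" and the generation settings are assumptions, and the base model needs fine-tuning before its outputs are useful for summarization:

```python
def build_summarizer(model_name="Iulian277/ro-bart-large-512"):
    """Return a summarize(text) callable backed by the pretrained BART
    checkpoint. The repo id above is an assumption based on this card."""
    from transformers import AutoTokenizer, BartForConditionalGeneration

    tok = AutoTokenizer.from_pretrained(model_name)
    model = BartForConditionalGeneration.from_pretrained(model_name)

    def summarize(text, max_new_tokens=64):
        # Truncate to the model's 512-token input limit.
        inputs = tok(text, return_tensors="pt", truncation=True, max_length=512)
        ids = model.generate(**inputs, max_new_tokens=max_new_tokens)
        return tok.decode(ids[0], skip_special_tokens=True)

    return summarize
```

After fine-tuning on a Romanian summarization dataset, `build_summarizer()` would return a ready-to-call summarization function.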