
Bertin Base Random

Developed by bertin-project
A RoBERTa-base model trained entirely from scratch on Spanish data, specialized for masked language modeling.
Downloads 19
Release date: 3/2/2022

Model Overview

This model is a Spanish pretrained model based on the RoBERTa architecture, specifically designed for masked language modeling tasks in Spanish text.

Model Features

Pure Spanish Training
The model is trained entirely from scratch on Spanish data, making it well suited to Spanish text.
Based on RoBERTa Architecture
Utilizes the RoBERTa-base architecture, offering robust text representation capabilities.
Large-scale Training Data
The training data comes from the Spanish portion of the mC4 corpus, comprising approximately 50 million documents.
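To give a sense of what working with that corpus looks like, here is a minimal sketch of streaming Spanish mC4 with the Hugging Face datasets library. The dataset id "mc4" with config "es" and the length-based filter are illustrative assumptions, not details from this model card.

```python
# Hedged sketch: streaming Spanish mC4 documents for pretraining-style use.
# The dataset id/config ("mc4", "es") is an assumption about the source.

def looks_substantial(text: str, min_chars: int = 200) -> bool:
    """Crude pretraining filter (hypothetical): keep reasonably long documents."""
    return len(text.strip()) >= min_chars

if __name__ == "__main__":
    from datasets import load_dataset  # heavy dependency, imported lazily

    # Streaming avoids downloading the full ~50M-document split up front.
    stream = load_dataset("mc4", "es", split="train", streaming=True)
    kept = (ex["text"] for ex in stream if looks_substantial(ex["text"]))
    for _, doc in zip(range(3), kept):
        print(doc[:80])
```

Streaming mode is the practical choice here, since materializing tens of millions of documents locally is rarely feasible.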

Model Capabilities

Spanish Text Understanding
Masked Language Prediction
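The capabilities above can be sketched with the transformers fill-mask pipeline. The checkpoint id "bertin-project/bertin-base-random" is an assumption inferred from the project and model names on this page, and the example sentence is illustrative.

```python
# Hedged sketch of masked-word prediction with this model family.
# Assumption: the checkpoint is published as "bertin-project/bertin-base-random".

def build_masked_sentence(template: str, mask_token: str = "<mask>") -> str:
    """Insert the model's mask token into a sentence template.

    RoBERTa-style tokenizers conventionally use "<mask>" as the mask token.
    """
    return template.format(mask=mask_token)

if __name__ == "__main__":
    from transformers import pipeline  # heavy dependency, imported lazily

    sentence = build_masked_sentence("Me gusta comer {mask} por la mañana.")
    fill = pipeline("fill-mask", model="bertin-project/bertin-base-random")
    for candidate in fill(sentence, top_k=3):
        print(candidate["token_str"], round(candidate["score"], 3))
```

Each candidate is a dictionary with the predicted token and its score, so the top completions for the masked slot can be inspected directly.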

Use Cases

Natural Language Processing
Text Completion
Predict missing words or phrases in sentences
Language Model Fine-tuning
Serves as a base model for fine-tuning on downstream NLP tasks
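As a sketch of the fine-tuning use case, the pretrained checkpoint can be loaded with a task head via transformers. The checkpoint id and the sentiment label set below are illustrative assumptions, not part of this model card.

```python
# Hedged sketch: preparing this checkpoint for downstream classification.
# The model id and label names are hypothetical examples.

LABELS = ["negativo", "neutro", "positivo"]  # hypothetical sentiment labels

def label_maps(labels):
    """Build the id2label/label2id mappings used by transformers configs."""
    id2label = {i: label for i, label in enumerate(labels)}
    label2id = {label: i for i, label in enumerate(labels)}
    return id2label, label2id

if __name__ == "__main__":
    # Heavy dependencies imported lazily so the helpers above stay standalone.
    from transformers import AutoModelForSequenceClassification, AutoTokenizer

    id2label, label2id = label_maps(LABELS)
    tokenizer = AutoTokenizer.from_pretrained("bertin-project/bertin-base-random")
    model = AutoModelForSequenceClassification.from_pretrained(
        "bertin-project/bertin-base-random",
        num_labels=len(LABELS),
        id2label=id2label,
        label2id=label2id,
    )
    # From here, fine-tune with transformers.Trainer or a custom training loop.
```

Loading with `AutoModelForSequenceClassification` replaces the masked-LM head with a freshly initialized classification head, which is the standard starting point for fine-tuning.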