
Bertin Base Stepwise

Developed by bertin-project
A Spanish pre-trained model based on the RoBERTa architecture, specialized in masked language modeling
Downloads: 15
Release Time: 3/2/2022

Model Overview

This is a Spanish RoBERTa model trained from scratch, designed for masked language modeling on Spanish text.

Model Features

Spanish language optimization
Specially trained and optimized for Spanish text
Data quality filtering
Training data was filtered by perplexity, discarding documents with extremely high or low perplexity (see the sketch after this list)
Community-driven development
Developed during the Flax/JAX Community Week, with TPU resources provided by Google
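
The filtering idea described above can be pictured in a few lines: given a scoring function (for example, the perplexity assigned by a small language model) and precomputed low/high thresholds, keep only the documents in the middle band. This is a minimal illustration of the concept, not the project's actual data pipeline; the scoring function and thresholds below are placeholders.

```python
from typing import Callable, Iterable, List

def filter_by_perplexity(
    docs: Iterable[str],
    score_fn: Callable[[str], float],  # placeholder: e.g. an n-gram LM's perplexity
    low_threshold: float,
    high_threshold: float,
) -> List[str]:
    """Keep documents whose score falls inside [low_threshold, high_threshold].

    Documents the scorer finds extremely easy (very low perplexity, often
    boilerplate) or extremely hard (very high perplexity, often noise) are dropped.
    """
    return [doc for doc in docs if low_threshold <= score_fn(doc) <= high_threshold]

# Toy usage with document length standing in for a real perplexity score.
docs = ["Texto normal en español sobre un tema cualquiera.", "aaa", "x" * 5000]
print(filter_by_perplexity(docs, score_fn=lambda d: float(len(d)),
                           low_threshold=10, high_threshold=1000))
```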

Model Capabilities

Spanish text understanding
Masked language modeling prediction

Use Cases

Text processing
Text completion
Predicting masked words in sentences
Example: Predicting the <mask> portion in 'Fui a la librería a comprar un <mask>.'
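
A minimal usage sketch for this use case with the Hugging Face Transformers fill-mask pipeline. The Hub id "bertin-project/bertin-base-stepwise" is inferred from the developer and model names shown on this page, so confirm the exact repository name before use.

```python
from transformers import pipeline

# Hub id inferred from this page ("bertin-project" + "Bertin Base Stepwise");
# confirm the exact repository name on the Hugging Face Hub.
unmasker = pipeline("fill-mask", model="bertin-project/bertin-base-stepwise")

# Predict the masked word in the example sentence above.
for prediction in unmasker("Fui a la librería a comprar un <mask>."):
    print(prediction["token_str"], round(prediction["score"], 3))
```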