
roberta-base-bne

Developed by PlanTL-GOB-ES
A Spanish masked language model based on the RoBERTa architecture, trained on 570GB of cleaned text from the Spanish National Library.
Downloads: 28.76k
Released: 3/2/2022

Model Overview

This model is designed specifically for Spanish. It is suitable for masked language modeling out of the box and for fine-tuning on downstream NLP tasks.

Model Features

Large-Scale Corpus Training
Trained on 570GB of high-quality Spanish text compiled by the Spanish National Library between 2009 and 2019
Domain-Specific Adaptation
Optimized for the characteristics of the Spanish language, including an understanding of local cultural context
Multi-Task Support
Can be used directly for masked-token prediction or fine-tuned for a variety of downstream NLP tasks
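As a minimal sketch of direct masked-token prediction, the model can be loaded through the Hugging Face `transformers` fill-mask pipeline. This assumes `transformers` and a backend such as PyTorch are installed and that the model weights can be downloaded from the Hub; the helper function name and example sentence below are illustrative, not from the model's documentation.

```python
from transformers import pipeline

def top_predictions(text, model_name="PlanTL-GOB-ES/roberta-base-bne", k=3):
    """Return the top-k fill-mask predictions for a sentence that
    contains the RoBERTa mask token <mask>."""
    unmasker = pipeline("fill-mask", model=model_name)
    # Each prediction dict includes the filled-in token string and a score.
    return [p["token_str"] for p in unmasker(text, top_k=k)]

if __name__ == "__main__":
    # Illustrative Spanish sentence with one masked token.
    print(top_predictions("Me gusta la comida <mask>."))
```

Running the script prints the model's top candidate words for the masked position, ranked by probability.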

Model Capabilities

Masked Word Prediction
Text Feature Extraction
Named Entity Recognition
Text Classification
Question Answering Systems
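For text feature extraction, the base model can be used without any task head to produce contextual embeddings. The sketch below mean-pools the last hidden states into a single sentence vector; it assumes `transformers` and `torch` are installed, and the pooling strategy is a common convention rather than something prescribed by this model's documentation.

```python
import torch
from transformers import AutoModel, AutoTokenizer

def sentence_embedding(text, model_name="PlanTL-GOB-ES/roberta-base-bne"):
    """Encode a sentence into a single fixed-size vector by
    mean-pooling the model's last hidden states."""
    tokenizer = AutoTokenizer.from_pretrained(model_name)
    model = AutoModel.from_pretrained(model_name)
    with torch.no_grad():
        inputs = tokenizer(text, return_tensors="pt")
        outputs = model(**inputs)
    # last_hidden_state has shape (1, seq_len, hidden_size);
    # averaging over seq_len yields one hidden_size-dim vector.
    return outputs.last_hidden_state.mean(dim=1).squeeze(0)

if __name__ == "__main__":
    vec = sentence_embedding("El cielo está despejado hoy.")
    print(vec.shape)
```

Such embeddings can feed downstream classifiers or similarity search; for named entity recognition, classification, or question answering, the usual route is fine-tuning with a task-specific head instead.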

Use Cases

Text Understanding
Semantic Completion
Automatically completes sentences containing masked tokens; examples show accurate prediction of contextually appropriate words
EdTech
Language Learning Assistance
Generates Spanish learning materials