
MLM Spanish RoBERTa Base

Developed by MMG
A Spanish pretrained language model based on the RoBERTa architecture, focused on masked language modeling.
Release date: 3/2/2022

Model Overview

This model uses the RoBERTa base architecture, trained specifically for Spanish, and is suitable for a wide range of natural language processing tasks.
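The checkpoint can be used directly for masked-token prediction. The snippet below is a minimal sketch using the Hugging Face transformers fill-mask pipeline; the repository id MMG/mlm-spanish-roberta-base is inferred from the developer and model name above and should be treated as an assumption.

```python
# Minimal sketch of masked-token prediction with this model.
# The repository id below is an assumption based on the developer (MMG)
# and model name; adjust it if the checkpoint is hosted elsewhere.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="MMG/mlm-spanish-roberta-base")

# RoBERTa-style tokenizers use "<mask>" as the mask token.
for prediction in fill_mask("La capital de España es <mask>."):
    print(f"{prediction['token_str']!r}: {prediction['score']:.3f}")
```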

Model Features

High-performance Spanish processing
Strong results on the GLUES benchmark, particularly in part-of-speech tagging and document classification.
Large-scale training data
Trained on 3.6GB of raw Spanish text
Robust architecture
Built on the proven RoBERTa base architecture for stable, reliable performance.

Model Capabilities

Masked language modeling
Text classification
Named entity recognition
Part-of-speech tagging (a token-classification sketch follows this list)
Dependency parsing
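Token-level capabilities such as named entity recognition and part-of-speech tagging are typically obtained by loading this checkpoint as a backbone with a fresh token-classification head and fine-tuning it on labeled Spanish data. The sketch below illustrates that pattern; the repository id and the label set are illustrative assumptions, not details from this card.

```python
# Sketch: reuse the pretrained checkpoint as a backbone for token-level tasks
# (NER, POS tagging). The repository id and label set are illustrative assumptions.
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_id = "MMG/mlm-spanish-roberta-base"
labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG"]  # example NER tag set

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(
    model_id,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)
# The classification head is randomly initialized; fine-tune on a labeled
# Spanish corpus (e.g. with the Trainer API) before using it for prediction.
```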

Use Cases

Natural language understanding
Text classification
Classify Spanish documents (a fine-tuning sketch follows this section)
93.00% accuracy on the GLUES benchmark
Named entity recognition
Identify named entities in Spanish text
F1 score of 85.34
Language analysis
Part-of-speech tagging
Tag the part of speech of each word in Spanish text
Accuracy of 97.49%
Dependency parsing
Analyze the grammatical structure of Spanish sentences
UAS/LAS of 85.14/81.08
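For the document-classification use case, a common pattern is to attach a sequence-classification head to the checkpoint and fine-tune it on a labeled Spanish dataset. The sketch below shows that setup under stated assumptions: the repository id, the two-label setup, and the example sentence are illustrative, not from this card.

```python
# Sketch: adapt the checkpoint for Spanish document classification.
# Repository id, label count, and example text are illustrative assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

model_id = "MMG/mlm-spanish-roberta-base"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

inputs = tokenizer("El gobierno anunció nuevas medidas económicas.",
                   return_tensors="pt", truncation=True)
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # untrained head: fine-tune before relying on these scores
```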