RoBERTa Small
Developed by klue
A compact RoBERTa model pre-trained for Korean, developed by the KLUE benchmark team
Downloads: 3,362
Release Date: 3/2/2022
Model Overview
This is a compact version of RoBERTa optimized for Korean, suitable for a variety of Korean natural language processing tasks.
Model Features
Korean Optimization
Pre-trained and optimized specifically for Korean language characteristics
Compact Architecture
Fewer parameters than standard RoBERTa models, making it suitable for resource-constrained scenarios
BERT Tokenizer Compatibility
Uses BertTokenizer instead of RobertaTokenizer, maintaining compatibility with the BERT ecosystem
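As a minimal loading sketch, the model pairs RoBERTa weights with a BERT-style tokenizer; it assumes the checkpoint is published on the Hugging Face Hub under the klue/roberta-small ID (inferred from the developer and model name above):

```python
# Minimal loading sketch; the "klue/roberta-small" Hub ID is an
# assumption based on the developer and model name shown above.
from transformers import AutoModel, AutoTokenizer

# AutoTokenizer resolves to the BERT tokenizer for this checkpoint
# rather than RobertaTokenizer, as noted in the feature list above.
tokenizer = AutoTokenizer.from_pretrained("klue/roberta-small")
model = AutoModel.from_pretrained("klue/roberta-small")

print(type(tokenizer).__name__)  # expected: BertTokenizerFast (fast BertTokenizer)

# Encode a Korean sentence and run a forward pass.
inputs = tokenizer("한국어 문장을 인코딩합니다.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```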
Model Capabilities
Korean Text Understanding
Korean Text Classification
Korean Question Answering Systems
Korean Named Entity Recognition
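A quick way to probe the Korean text understanding capability is masked-token prediction with the model's pretraining head. This sketch again assumes the klue/roberta-small Hub ID:

```python
# Masked-token prediction sketch for probing Korean text understanding;
# the "klue/roberta-small" Hub ID is assumed as above.
from transformers import pipeline

fill_mask = pipeline("fill-mask", model="klue/roberta-small")

# Read the mask token from the tokenizer instead of hardcoding it,
# since this checkpoint uses a BERT-style tokenizer.
mask = fill_mask.tokenizer.mask_token
for pred in fill_mask(f"대한민국의 수도는 {mask} 이다."):
    print(pred["token_str"], round(pred["score"], 3))
```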
Use Cases
Natural Language Processing
Korean Text Classification
Sentiment analysis or topic classification of Korean text; see the fine-tuning sketch after this list
Korean Question Answering System
Building question answering applications for Korean
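For the text classification use case, a common pattern is to attach a sequence classification head to the base checkpoint and fine-tune it on labeled Korean data. A minimal sketch follows, assuming the klue/roberta-small Hub ID and an illustrative two-class sentiment setup:

```python
# Sentiment classification sketch; the Hub ID and two-class label setup
# are illustrative assumptions. The classification head is randomly
# initialized, so the model must be fine-tuned on labeled Korean data
# before its predictions are meaningful.
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "klue/roberta-small"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

inputs = tokenizer("이 영화 정말 재미있어요!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # untrained head: scores not yet meaningful
```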