
Roberta Ko Small

Developed by lassl
A compact Korean RoBERTa model trained under the LASSL framework, suitable for various Korean natural language processing tasks.
Release Time: 3/2/2022

Model Overview

This is a pre-trained Korean RoBERTa model for Korean natural language processing. Out of the box it supports masked language modeling; downstream tasks are handled by fine-tuning the pre-trained encoder.
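The hedged sketch below shows the most direct way to exercise the pre-trained checkpoint: a fill-mask pipeline from the transformers library. The Hub ID lassl/roberta-ko-small is an assumption inferred from the developer name and model title, not a detail confirmed by this card; verify the exact identifier before use.

```python
from transformers import pipeline

# Assumed Hub ID based on the developer (lassl) and the model name; verify before use.
fill_mask = pipeline("fill-mask", model="lassl/roberta-ko-small")

# Use the tokenizer's own mask token rather than hard-coding "<mask>",
# since the exact token depends on how the tokenizer was built.
mask = fill_mask.tokenizer.mask_token
for prediction in fill_mask(f"이 영화는 정말 {mask}."):
    print(prediction["token_str"], round(prediction["score"], 4))
```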

Model Features

Korean Optimization
Pre-trained specifically on Korean text, making it well suited to Korean natural language processing tasks.
Multi-task Support
Can be fine-tuned for a range of natural language processing tasks, including sentiment analysis, natural language inference, semantic similarity, question answering, and machine reading comprehension (see the fine-tuning sketch after this list).
Efficient Training
Trained under the LASSL framework on a rich corpus drawn from multiple Korean-language datasets.
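As an illustration of the fine-tuning path mentioned above, here is a minimal sketch for binary sentence classification (an NSMC-style positive/negative setup). The Hub ID and the two-label configuration are assumptions for illustration, not details confirmed by this card.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

model_id = "lassl/roberta-ko-small"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
# num_labels=2 suits a binary (positive/negative) task such as NSMC;
# the classification head is newly initialized and must be fine-tuned.
model = AutoModelForSequenceClassification.from_pretrained(model_id, num_labels=2)

inputs = tokenizer("배우들의 연기가 훌륭했다.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits
print(logits.softmax(dim=-1))  # near-uniform until the head is trained
```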

Model Capabilities

Sentiment Analysis
Natural Language Inference
Semantic Similarity Calculation
Question Answering System
Machine Reading Comprehension
Masked Language Modeling

Use Cases

Sentiment Analysis
Movie Review Sentiment Analysis
Classify the sentiment (positive/negative) of Korean movie reviews.
Achieved an accuracy of 87.8846 on the NSMC sentiment analysis task.
Question Answering System
Korean Question Answering
Answer questions over Korean text passages (see the sketch after this list).
Achieved a score of 83.1780 on the KorQuAD question-answering task.
Semantic Similarity
Sentence Similarity Calculation
Calculate the semantic similarity between two Korean sentences.
Achieved a score of 83.8353 on the KLUE semantic textual similarity task.
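For the question-answering use case, a hedged sketch of an extractive QA pipeline is shown below. The span-prediction head on the base checkpoint is randomly initialized, so in practice a KorQuAD fine-tuned variant is needed for meaningful answers; the Hub ID remains an assumption.

```python
from transformers import pipeline

# The question-answering head on the base checkpoint is newly initialized;
# load a checkpoint fine-tuned on KorQuAD-style data for real use.
qa = pipeline("question-answering", model="lassl/roberta-ko-small")  # assumed ID
result = qa(
    question="대한민국의 수도는 어디인가?",
    context="서울은 대한민국의 수도이자 최대 도시이다.",
)
print(result["answer"], round(result["score"], 4))
```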