Indo Roberta Small

Developed by w11wo
Indonesian Small RoBERTa is a masked language model based on the RoBERTa architecture, trained specifically on Indonesian text and suited to text-infilling and feature-extraction tasks.
Downloads 50
Release date: 3/2/2022

Model Overview

This model is an Indonesian masked language model based on the RoBERTa architecture. It was trained on Indonesian Wikipedia data and is intended primarily for text infilling and feature extraction.

Model Features

Indonesian Language Optimization
Trained specifically on Indonesian text, making it well suited to Indonesian-language tasks.
Lightweight Model
With only 84M parameters, it is suitable for deployment in resource-constrained environments.
Based on RoBERTa Architecture
Builds on the RoBERTa architecture, which provides strong language-understanding capability.

Model Capabilities

Text Infilling
Feature Extraction
Indonesian Text Processing
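The text-infilling capability can be exercised with Hugging Face's fill-mask pipeline. A minimal sketch follows; the Hub repository id `w11wo/indonesian-roberta-small` and the helper function names are assumptions, not confirmed by this page:

```python
def mask_word(sentence: str, word: str, mask_token: str = "<mask>") -> str:
    """Replace the first occurrence of `word` with RoBERTa's mask token."""
    return sentence.replace(word, mask_token, 1)

def top_predictions(masked_sentence: str, k: int = 3):
    """Return the model's top-k fill-mask predictions for a masked sentence."""
    # Imported lazily so mask_word above works without transformers installed.
    # Assumption: the model is published on the Hub under this repository id.
    from transformers import pipeline
    fill_mask = pipeline("fill-mask", model="w11wo/indonesian-roberta-small")
    return [(p["token_str"], p["score"]) for p in fill_mask(masked_sentence)[:k]]
```

For example, `top_predictions(mask_word("Budi sedang berada di sekolah.", "sekolah"))` would return the model's most likely replacements for the masked word, each with a confidence score.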

Use Cases

Text Processing
Text Infilling
Fill in missing parts of sentences, such as 'Budi is at school <mask>.'
Feature Extraction
Extract semantic features from text for downstream tasks
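For feature extraction, one common recipe is to mean-pool the model's last hidden states into a fixed-size sentence vector and compare vectors with cosine similarity. This is a sketch of that recipe, not the model's documented API; the repository id and function names are assumptions:

```python
import math

def cosine_similarity(a, b):
    """Cosine similarity between two equal-length vectors (plain lists)."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(y * y for y in b))
    return dot / (norm_a * norm_b)

def sentence_embedding(text, model_id="w11wo/indonesian-roberta-small"):
    """Mean-pool the last hidden states into one sentence-level feature vector."""
    # Imported lazily so cosine_similarity works without torch/transformers.
    # Assumption: model_id is the Hub repository id for this model.
    import torch
    from transformers import AutoModel, AutoTokenizer
    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModel.from_pretrained(model_id)
    with torch.no_grad():
        out = model(**tokenizer(text, return_tensors="pt"))
    return out.last_hidden_state.mean(dim=1).squeeze(0).tolist()
```

The resulting vectors can feed downstream tasks such as semantic search or clustering, e.g. `cosine_similarity(sentence_embedding(a), sentence_embedding(b))` as a rough similarity score between two Indonesian sentences.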