
Distilbert Base Indonesian

Developed by cahya
This is a distilled version of the Indonesian BERT base model, designed for Indonesian language processing as an uncased (case-insensitive) model.
Downloads: 1,815
Release date: 3/2/2022

Model Overview

This model is a pre-trained language model based on Indonesian datasets, suitable for downstream tasks such as text classification and text generation.

Model Features

Distilled model
A distilled version of the Indonesian BERT base model that retains most of the performance while being more lightweight.
Case-insensitive processing
All input text is converted to lowercase, simplifying text preprocessing steps.
Indonesian language optimization
Pre-trained specifically for Indonesian, making it suitable for Indonesian text processing tasks.
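Because the model is uncased, input text is expected in lowercase. The tokenizer shipped with the model normally applies this normalization itself, but the step can be sketched in plain Python (function name is illustrative, not part of any library):

```python
def normalize_for_uncased_model(text: str) -> str:
    """Lowercase input text to match the model's uncased vocabulary."""
    return text.lower()

print(normalize_for_uncased_model("Saya Tinggal di Jakarta"))
# -> saya tinggal di jakarta
```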

Model Capabilities

Masked language modeling
Text feature extraction
Text classification
Text generation
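The masked language modeling capability can be exercised with the Hugging Face `transformers` fill-mask pipeline. A minimal sketch, assuming the model is available on the Hugging Face Hub under the id `cahya/distilbert-base-indonesian` (the example sentence and its predictions are illustrative, not taken from the model card):

```python
# Masked Indonesian sentence ("The capital of Indonesia is [MASK].").
# [MASK] is the standard mask token for BERT/DistilBERT vocabularies.
masked_sentence = "Ibu kota Indonesia adalah [MASK]."

if __name__ == "__main__":
    # Requires the `transformers` package and downloads the model weights.
    from transformers import pipeline

    fill_mask = pipeline("fill-mask", model="cahya/distilbert-base-indonesian")
    for prediction in fill_mask(masked_sentence):
        # Each prediction carries the candidate token and a confidence score.
        print(prediction["token_str"], round(prediction["score"], 4))
```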

Use Cases

Text processing
Fill-mask
Predict masked words in sentences
Predicts contextually appropriate words for masked positions in Indonesian sentences
Text feature extraction
Obtain vector representations of text
Can be used for downstream tasks like classification or similarity calculation
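Once sentence vectors have been extracted from the model, similarity between texts is commonly scored with cosine similarity. A self-contained sketch; the short vectors below are toy stand-ins for the model's real embeddings (which are higher-dimensional):

```python
import math

def cosine_similarity(u, v):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(a * b for a, b in zip(u, v))
    norm_u = math.sqrt(sum(a * a for a in u))
    norm_v = math.sqrt(sum(b * b for b in v))
    return dot / (norm_u * norm_v)

# Toy 4-dimensional stand-ins for extracted text embeddings.
emb_a = [0.2, 0.1, 0.4, 0.3]
emb_b = [0.2, 0.1, 0.4, 0.3]
emb_c = [0.9, -0.5, 0.0, 0.1]

print(cosine_similarity(emb_a, emb_b))  # identical vectors -> 1.0
print(cosine_similarity(emb_a, emb_c))  # dissimilar vectors -> lower score
```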