
Javanese DistilBERT Small

Developed by w11wo
A Javanese masked language model based on DistilBERT, trained on Javanese Wikipedia
Downloads: 22
Release Time: 3/2/2022

Model Overview

This model is a Javanese masked language model based on the DistilBERT architecture, primarily used for Javanese text understanding and generation tasks.

Model Features

Efficient and Lightweight
Based on the DistilBERT architecture with only 66M parameters, making it more lightweight and efficient compared to the full BERT model.
Javanese Optimized
Specially trained and optimized for Javanese, making it suitable for Javanese text processing.
Wikipedia Trained
Trained on the latest Javanese Wikipedia articles, covering a wide range of topics.

Model Capabilities

Masked Language Prediction
Javanese Text Understanding
Javanese Text Generation

Use Cases

Natural Language Processing
Text Completion
Predict masked words in sentences
Example: given 'Joko [MASK] wis kelas siji SMA.', the model predicts suitable fill-ins for the [MASK] token
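A minimal sketch of the text-completion use case with the Hugging Face fill-mask pipeline. The model id `w11wo/javanese-distilbert-small` is an assumption based on the developer name and model title; adjust it to the actual Hub repository if it differs.

```python
def mask_word(sentence: str, word: str) -> str:
    """Replace the first occurrence of `word` with the [MASK] token."""
    return sentence.replace(word, "[MASK]", 1)

def predict_masked(sentence: str, top_k: int = 5):
    """Return the top_k candidate fill-ins for the [MASK] token in `sentence`.

    NOTE: the model id below is an assumption, not confirmed by this page.
    """
    from transformers import pipeline  # lazy import; requires `transformers`

    fill_mask = pipeline("fill-mask", model="w11wo/javanese-distilbert-small")
    return [(r["token_str"], r["score"]) for r in fill_mask(sentence, top_k=top_k)]

# Example usage (downloads the model on first run):
#   predict_masked("Joko [MASK] wis kelas siji SMA.")
```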
Feature Extraction
Extract semantic features from Javanese text
Can be used for downstream NLP tasks such as classification or clustering
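The feature-extraction use case can be sketched as follows: run the encoder, take the last hidden states as per-token embeddings, and mean-pool them into a sentence vector for downstream classification or clustering. The model id is the same assumption as above.

```python
def extract_features(sentence: str,
                     model_id: str = "w11wo/javanese-distilbert-small"):
    """Return per-token embeddings (list of lists) for a Javanese sentence.

    NOTE: `model_id` is an assumed Hub repository name.
    """
    from transformers import AutoTokenizer, AutoModel  # lazy imports
    import torch

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModel.from_pretrained(model_id)
    inputs = tokenizer(sentence, return_tensors="pt")
    with torch.no_grad():
        outputs = model(**inputs)
    # last_hidden_state has shape (1, seq_len, hidden_size)
    return outputs.last_hidden_state[0].tolist()

def mean_pool(token_embeddings):
    """Average token vectors into a single fixed-size sentence vector."""
    n = len(token_embeddings)
    dim = len(token_embeddings[0])
    return [sum(tok[d] for tok in token_embeddings) / n for d in range(dim)]

# Example usage:
#   vec = mean_pool(extract_features("Joko wis kelas siji SMA."))
```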