
Javanese BERT Small

Developed by w11wo
A Javanese masked language model based on the BERT architecture, fine-tuned on Javanese Wikipedia data
Downloads 22
Release Time: 3/2/2022

Model Overview

This model is a small BERT variant optimized for Javanese, used primarily for masked language modeling and text feature extraction on Javanese text.

Model Features

Javanese Optimization
Fine-tuned specifically for Javanese, enabling more effective processing of Javanese text
Lightweight Architecture
Uses the BERT Small architecture with a moderate parameter count (110M), requiring relatively low computational resources
Wikipedia Training
Trained on recent Javanese Wikipedia articles covering a wide range of topics

Model Capabilities

Masked Language Prediction
Text Feature Extraction
Javanese Text Understanding

Use Cases

Natural Language Processing
Javanese Text Completion
Predict masked words in Javanese text
Example input: 'Aku mangan sate ing [MASK] bareng konco-konco'
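The snippet below is a minimal sketch of this use case with the Hugging Face transformers fill-mask pipeline; the Hub id w11wo/javanese-bert-small is an assumption based on the developer name and may need to be adjusted.

```python
from transformers import pipeline

# Hub id assumed from the developer name; adjust if the model is hosted under a different id.
model_name = "w11wo/javanese-bert-small"

# Fill-mask pipeline for Javanese masked word prediction.
fill_mask = pipeline("fill-mask", model=model_name, tokenizer=model_name)

# Print candidate words and scores for the masked token in the example sentence above.
for prediction in fill_mask("Aku mangan sate ing [MASK] bareng konco-konco"):
    print(f"{prediction['token_str']}\t{prediction['score']:.4f}")
```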
Text Feature Extraction
Extract Javanese text features for downstream NLP tasks
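Below is a minimal sketch of feature extraction with transformers and PyTorch, mean-pooling the last hidden states into a single sentence vector; the Hub id and the pooling strategy are illustrative assumptions, not prescribed by the model card.

```python
import torch
from transformers import BertModel, BertTokenizerFast

# Hub id assumed from the developer name; adjust if the model is hosted under a different id.
model_name = "w11wo/javanese-bert-small"

tokenizer = BertTokenizerFast.from_pretrained(model_name)
model = BertModel.from_pretrained(model_name)
model.eval()

# Encode a Javanese sentence and extract contextual token features.
sentence = "Aku mangan sate ing warung bareng konco-konco"
inputs = tokenizer(sentence, return_tensors="pt")

with torch.no_grad():
    outputs = model(**inputs)

# Mean-pool token embeddings into one sentence vector for downstream tasks
# such as classification or retrieval (pooling choice is an assumption).
sentence_embedding = outputs.last_hidden_state.mean(dim=1)
print(sentence_embedding.shape)  # torch.Size([1, hidden_size])
```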