Japanese RoBERTa Base

Developed by rinna
A base-sized Japanese RoBERTa model trained by rinna Co., Ltd., suitable for masked language modeling tasks in Japanese text.
Downloads: 9,375
Release date: 3/2/2022

Model Overview

This is a Japanese pretrained language model based on the RoBERTa architecture, primarily used for masked word prediction tasks in Japanese text.
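As a quick orientation, below is a minimal masked-word-prediction sketch using the Hugging Face transformers library. It assumes the checkpoint is published as rinna/japanese-roberta-base; rinna's own usage notes may call for extra preprocessing (such as lower-casing the input and manually prepending a [CLS] token), so consult the official model card before relying on this exact recipe.

import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("rinna/japanese-roberta-base")
model = AutoModelForMaskedLM.from_pretrained("rinna/japanese-roberta-base")

# Build a sentence with one masked position (mirrors the 'オリンピック'
# example cited in the use cases below).
text = "4年に1度" + tokenizer.mask_token + "は開かれる。"
inputs = tokenizer(text, return_tensors="pt")

with torch.no_grad():
    logits = model(**inputs).logits

# Locate the mask and print the five most likely fillers.
mask_pos = (inputs["input_ids"][0] == tokenizer.mask_token_id).nonzero(as_tuple=True)[0]
top_ids = logits[0, mask_pos[0]].topk(5).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))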

Model Features

Japanese-specific Pretraining
Pretrained specifically on Japanese text and optimized for the linguistic characteristics of Japanese
Based on RoBERTa Architecture
Builds on RoBERTa's refinement of BERT: the next-sentence-prediction objective is dropped, and training uses larger batches and more data
SentencePiece Tokenization
Uses a SentencePiece-based tokenizer trained on Japanese Wikipedia; see the tokenization sketch after this list
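The tokenizer behavior described above can be sanity-checked with a short sketch like the one below. It again assumes the rinna/japanese-roberta-base checkpoint and that the sentencepiece package is installed; the sample sentence is illustrative only.

from transformers import AutoTokenizer

# Loading this checkpoint's tokenizer requires the sentencepiece package.
tokenizer = AutoTokenizer.from_pretrained("rinna/japanese-roberta-base")

# SentencePiece segments raw Japanese text into subword pieces directly,
# with no whitespace pre-segmentation step.
tokens = tokenizer.tokenize("日本語の自然言語処理を学ぶ。")
print(tokens)                                   # subword pieces
print(tokenizer.convert_tokens_to_ids(tokens))  # vocabulary ids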

Model Capabilities

Masked word prediction
Japanese text understanding
Contextual semantic analysis

Use Cases

Text Completion
Japanese Text Masked Word Prediction
Predicts masked words in Japanese sentences
Correctly predicted words such as 'オリンピック' in the published examples
Language Model Fine-tuning
Downstream NLP Tasks
Can serve as a base model to fine-tune for a range of downstream Japanese NLP tasks, as sketched below
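As a sketch of the fine-tuning route, the snippet below loads the checkpoint with a freshly initialized classification head via the standard transformers API. The task (binary classification) and label count are placeholder assumptions, not details from this model card; a normal PyTorch or Trainer training loop would follow.

from transformers import AutoModelForSequenceClassification, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("rinna/japanese-roberta-base")
# num_labels=2 is a placeholder for, e.g., binary sentiment classification;
# the head weights are newly initialized and must be trained.
model = AutoModelForSequenceClassification.from_pretrained(
    "rinna/japanese-roberta-base",
    num_labels=2,
)

# A single forward pass to verify shapes before training.
enc = tokenizer("この映画は素晴らしい。", return_tensors="pt")
print(model(**enc).logits.shape)  # torch.Size([1, 2])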