
ModernBERT Large Japanese Aozora

Developed by KoichiYasuoka
This is a ModernBERT model pre-trained on Aozora Bunko texts, specifically designed for Japanese text processing.
Release Time: 1/7/2025

Model Overview

This model is a ModernBERT pre-trained on Aozora Bunko texts. It is suitable for masked language modeling in Japanese and can be fine-tuned for downstream tasks.
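As a minimal sketch, the masked-language-model capability can be exercised with the Hugging Face transformers fill-mask pipeline. The hub ID below is inferred from the model name on this card, and ModernBERT architectures require a recent transformers release (4.48 or later); both are assumptions, not guarantees.

```python
# Hedged sketch: fill a masked token with this model via the fill-mask pipeline.
# Assumes hub ID "KoichiYasuoka/modernbert-large-japanese-aozora" and
# transformers >= 4.48 (the first release with ModernBERT support).
from transformers import pipeline

fill = pipeline(
    "fill-mask",
    model="KoichiYasuoka/modernbert-large-japanese-aozora",
)

# "After arriving in Japan, be sure to visit [MASK]."
for candidate in fill("日本に着いたら[MASK]を訪ねなさい。"):
    print(candidate["token_str"], round(candidate["score"], 3))
```

Each candidate is a dict with the predicted token string and its score, sorted by probability.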

Model Features

Pre-trained on Aozora Bunko
The model is pre-trained on texts from Aozora Bunko, a digital library of public-domain Japanese literature, making it well suited to Japanese text processing.
Supports Downstream Task Fine-tuning
This model can be fine-tuned for downstream tasks such as part-of-speech tagging and dependency parsing.
High-Performance Training
Trained on 8× NVIDIA A100-SXM4-40GB GPUs in 10 hours and 5 minutes.

Model Capabilities

Japanese Text Processing
Masked Language Model
Part-of-Speech Tagging
Dependency Parsing

Use Cases

Natural Language Processing
Japanese Text Mask Filling
Used to fill masked parts in Japanese text, e.g., 'After arriving in Japan, be sure to visit [MASK].'
Part-of-Speech Tagging
Can be used for part-of-speech tagging tasks in Japanese text.
Dependency Parsing
Can be used for dependency parsing tasks in Japanese text.
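For downstream tasks such as part-of-speech tagging, the pre-trained encoder can be loaded with a token-classification head and then fine-tuned on annotated data. The sketch below illustrates only the setup step; the hub ID and the label set are assumptions for illustration, not part of this card.

```python
# Hedged sketch: attach a token-classification head for POS tagging.
# The hub ID and the UPOS-style label list are illustrative assumptions.
from transformers import AutoModelForTokenClassification, AutoTokenizer

labels = ["NOUN", "VERB", "ADJ", "ADV", "PART", "PUNCT"]  # illustrative subset
model_id = "KoichiYasuoka/modernbert-large-japanese-aozora"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(
    model_id,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)
# From here, fine-tune with transformers.Trainer on POS-annotated sentences;
# the classification head is randomly initialized and must be trained.
```

The same pattern applies to dependency parsing, though parsing typically needs a task-specific head rather than plain token classification.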