BERT Base Japanese UPOS
BERT model pre-trained on Japanese Wikipedia text for POS tagging and dependency parsing
Sequence Labeling
Transformers # Supports Multiple Languages # Japanese POS tagging # Dependency parsing # BERT fine-tuning

Downloads 40
Release Time: 3/2/2022
Model Overview
This is a BERT model designed for Japanese text processing, capable of performing POS tagging (UPOS) and dependency parsing on Japanese text. It is derived from the pre-trained bert-base-japanese-char-extended model and is suited to Japanese language processing tasks.
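A minimal POS-tagging sketch with Hugging Face Transformers follows. It assumes the model is published on the Hub under the ID KoichiYasuoka/bert-base-japanese-upos (an assumption inferred from the model name; adjust the ID to wherever your copy lives) and prints one UPOS label per character-level token:

```python
# Minimal UPOS-tagging sketch, assuming the Hub ID
# "KoichiYasuoka/bert-base-japanese-upos" (not confirmed by this page).
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_id = "KoichiYasuoka/bert-base-japanese-upos"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

text = "国境の長いトンネルを抜けると雪国であった。"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map each token's highest-scoring label ID back to its UPOS tag,
# dropping the [CLS]/[SEP] special tokens at the ends.
label_ids = logits.argmax(dim=-1)[0].tolist()[1:-1]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0].tolist())[1:-1]
for token, label_id in zip(tokens, label_ids):
    print(token, model.config.id2label[label_id])
```

Because the base model is character-extended, the tokens here are individual characters; labels such as B-NOUN/I-NOUN mark multi-character words.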
Model Features
Specialized for Japanese
A BERT model optimized for Japanese text, handling linguistic features unique to Japanese, such as the absence of whitespace between words
UPOS tagging
Supports the Universal Part-of-Speech (UPOS) tagset, providing standardized POS tags that are comparable across languages
Dependency parsing
Goes beyond POS tagging to analyze the dependency relationships between words in a sentence (see the sketch after this list)
Trained on Wikipedia
Pre-trained using Japanese Wikipedia text, providing broad domain coverage
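For the dependency-parsing feature, the author's esupar library wraps models like this one in a Universal Dependencies pipeline. The sketch below assumes esupar is installed (pip install esupar) and that esupar.load() accepts this model's Hub ID; both are assumptions, not confirmed by this page:

```python
# Dependency-parsing sketch via esupar (assumed installed: pip install esupar).
import esupar

# Assumption: esupar.load() accepts this model's Hub ID directly.
nlp = esupar.load("KoichiYasuoka/bert-base-japanese-upos")
doc = nlp("太郎は花子が読んでいる本を次郎に渡した")
print(doc)  # CoNLL-U style output: one token per line with UPOS, head, deprel
```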
Model Capabilities
Japanese POS tagging
Japanese dependency parsing
Japanese text processing
Use Cases
Natural Language Processing
Japanese text analysis
Analyzing the grammatical structure and parts of speech of Japanese text
Tags each word's POS and its dependency relationships within a sentence
Japanese learning aid
Assisting Japanese learners in understanding sentence structure and word usage
Provides detailed grammatical analysis results