
RoBERTa Small Japanese LUW UPOS

Developed by KoichiYasuoka
A RoBERTa model pre-trained on Aozora Bunko texts for Japanese POS tagging and dependency parsing.
Downloads: 1,545
Release Time: 3/2/2022

Model Overview

This is a small Japanese model based on the RoBERTa architecture, designed for POS tagging and dependency parsing. Each long unit word (LUW) is annotated with a UPOS (Universal Part-of-Speech) tag.
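As a rough usage sketch, the model can be loaded for token classification with the Hugging Face transformers library. The repository id below (KoichiYasuoka/roberta-small-japanese-luw-upos) and the example sentence are assumptions inferred from the model title, not taken from this page:

```python
# Minimal UPOS tagging sketch; the model id is an assumption based on the model title.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_id = "KoichiYasuoka/roberta-small-japanese-luw-upos"  # assumed Hub repository name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

text = "国境の長いトンネルを抜けると雪国であった。"  # example sentence, not from this page
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Map each subword token to its highest-scoring UPOS label
predicted_ids = logits.argmax(dim=-1)[0]
tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, label_id in zip(tokens, predicted_ids):
    print(token, model.config.id2label[int(label_id)])
```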

Model Features

Pre-trained on Aozora Bunko
The model is pre-trained on texts from Aozora Bunko, Japan's digital library of public-domain literature, making it well suited to processing Japanese text.
Long unit word tagging
Each long unit word is annotated with a UPOS (Universal Part-of-Speech) tag, so tags apply to complete words rather than sub-word pieces.
Small model
As a small model, it offers faster inference and lower resource requirements than larger variants while remaining effective for tagging and parsing.

Model Capabilities

Japanese text analysis
POS tagging
Dependency parsing (see the sketch below)
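For dependency parsing, models in this series are commonly driven through the author's esupar toolkit. The following is a minimal sketch, assuming esupar is installed (pip install esupar) and that its load() function accepts this model id; both are assumptions not stated on this page:

```python
# Dependency parsing sketch via esupar (assumed companion toolkit for this model).
import esupar

nlp = esupar.load("KoichiYasuoka/roberta-small-japanese-luw-upos")  # assumed model id
doc = nlp("国境の長いトンネルを抜けると雪国であった。")  # example sentence, not from this page
print(doc)  # parsed output, typically in CoNLL-U-style columns (UPOS tags and head/relation links)
```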

Use Cases

Natural Language Processing
Japanese text analysis
Analyzes the POS structure and syntactic relationships of Japanese texts
Outputs a POS tag and syntactic dependency relation for each word
Japanese language teaching aid
Used for grammatical analysis and sentence structure understanding in Japanese learning
Helps learners understand Japanese grammatical structures