
transformers-ud-japanese-electra-base-ginza-510

Developed by megagonlabs
A Japanese pretrained model based on the ELECTRA architecture, pretrained on approximately 200 million Japanese sentences from the mC4 dataset and fine-tuned on the UD_Japanese_BCCWJ corpus
Downloads 7,757
Release Time: 3/2/2022

Model Overview

This Japanese natural language processing model is based on the ELECTRA architecture, optimized for Japanese text, and capable of recognizing bunsetsu (phrase-unit) structures in Japanese sentences.

Model Features

Trained on a large-scale Japanese corpus
Pretrained on approximately 200 million Japanese sentences from the mC4 dataset
Domain-specific fine-tuning
Fine-tuned on the UD_Japanese_BCCWJ corpus to optimize Japanese dependency parsing
GiNZA integration
Can be used with GiNZA v5 to provide comprehensive Japanese NLP processing

Model Capabilities

Japanese text analysis
Dependency parsing
Bunsetsu structure recognition

Use Cases

Natural Language Processing
Japanese text parsing
Analyzing the structure and dependency relationships of Japanese sentences, accurately identifying bunsetsu structures
Japanese NLP application development
Serving as a base model for Japanese NLP applications
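For use as a base model outside GiNZA, the checkpoint can also be loaded with Hugging Face transformers. A minimal sketch, assuming the model id `megagonlabs/transformers-ud-japanese-electra-base-ginza-510` on the Hugging Face Hub (the tokenizer may require additional Japanese-specific dependencies to load):

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Model id inferred from the page title; verify on the Hugging Face Hub.
model_id = "megagonlabs/transformers-ud-japanese-electra-base-ginza-510"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

# Encode a Japanese sentence and obtain contextual token embeddings
inputs = tokenizer("日本語の文を解析します。", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# One hidden vector per input token (ELECTRA-base hidden size: 768)
print(outputs.last_hidden_state.shape)
```

The resulting `last_hidden_state` tensor can feed a downstream task head (tagging, parsing, classification) when building on this model.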