bert-large-japanese-wikipedia-ud-head-finetuned-inquiry
A BERT-large model pre-trained on Japanese Wikipedia and fine-tuned for Universal Dependencies (UD) head-parsing tasks
Downloads: 33
Release date: 12/1/2022
Model Overview
This model is a fine-tuned version of bert-large-japanese-wikipedia-ud-head on an unspecified dataset, intended primarily for Japanese text-processing tasks
Model Features
Japanese-specific Model
BERT model optimized specifically for Japanese text
UD Head Parsing Fine-tuning
Fine-tuned on Universal Dependencies (UD) head parsing tasks
Large-scale Pre-training
Pre-trained on Japanese Wikipedia data
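The upstream bert-large-japanese-wikipedia-ud-head model frames head parsing as extractive question answering: the "question" marks a target token and the "answer" span is its syntactic head. A minimal, model-free sketch of the standard QA span-decoding step (toy scores stand in for model logits; the function name is illustrative, not part of any library):

```python
# Sketch of the QA-style decoding used in extractive head detection.
# Assumption: this fine-tuned model keeps the upstream QA framing,
# where start/end scores over tokens select the head span.

def pick_head_span(start_scores, end_scores):
    """Return (start, end) token indices maximizing start+end score
    with start <= end, as in standard extractive QA decoding."""
    best, best_span = float("-inf"), (0, 0)
    for i, s in enumerate(start_scores):
        for j in range(i, len(end_scores)):
            if s + end_scores[j] > best:
                best = s + end_scores[j]
                best_span = (i, j)
    return best_span

# Toy scores for a 4-token sentence; token index 2 wins as the head.
start = [0.1, 0.2, 3.0, 0.5]
end = [0.0, 0.1, 2.5, 0.4]
print(pick_head_span(start, end))  # -> (2, 2)
```

In practice the scores would come from a `AutoModelForQuestionAnswering` forward pass, with one query per target token.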
Model Capabilities
Japanese Text Understanding
Question answering
Text Classification
Dependency Relation Analysis
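For dependency-relation analysis, the per-token head predictions are typically assembled into a tree. A hypothetical post-processing sketch (the helper is illustrative; it assumes the CoNLL-U convention where head index 0 marks the root):

```python
# Hypothetical post-processing: given one predicted head index per
# token (0 = root, per CoNLL-U convention), build a children table
# usable for downstream dependency-relation analysis.

def build_tree(heads):
    """heads[i] is the 1-based head of token i+1; 0 marks the root."""
    children = {i: [] for i in range(len(heads) + 1)}
    for tok, head in enumerate(heads, start=1):
        children[head].append(tok)
    return children

# "国境 の 長い トンネル": the nominal head トンネル (token 4) governs
# 国境 (token 1) and 長い (token 3); の (token 2) attaches to 国境.
heads = [4, 1, 4, 0]
tree = build_tree(heads)
print(tree[0])  # root token(s) -> [4]
print(tree[4])  # dependents of トンネル -> [1, 3]
```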
Use Cases
Business Consultation
Customer Inquiry Processing
Automatically process Japanese customer inquiry emails
Content Analysis
Japanese Text Parsing
Analyze grammatical structure and semantic relationships in Japanese text
© 2025 AIbase