
Bert Large Japanese Char Extended

Developed by KoichiYasuoka
This is a BERT model pre-trained on Japanese Wikipedia text, derived from bert-large-japanese-char, with its character embeddings extended to cover additional Kanji characters.
Release date: 3/2/2022

Model Overview

This model is a Japanese BERT model that enhances Kanji processing capabilities through extended character embeddings, suitable for various Japanese natural language processing tasks.
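
As a rough illustration of the character-level vocabulary, the sketch below loads the tokenizer and looks up a few Kanji directly. It assumes the Hugging Face model id KoichiYasuoka/bert-large-japanese-char-extended; the sample characters are arbitrary illustrations, not taken from the model card.

```python
# Minimal sketch: inspect the character-level vocabulary.
# Assumes the model id "KoichiYasuoka/bert-large-japanese-char-extended".
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "KoichiYasuoka/bert-large-japanese-char-extended"
)

# Each Kanji is its own token; with the extended embeddings, less common
# characters should resolve to real vocabulary ids rather than [UNK].
for ch in ["漢", "櫻", "凜"]:  # arbitrary sample characters
    token_id = tokenizer.convert_tokens_to_ids(ch)
    print(ch, token_id, token_id == tokenizer.unk_token_id)
```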

Model Features

Extended character embeddings
The character embeddings are extended to cover all Jōyō (commonly used) and Jinmeiyō (personal-name) Kanji characters
Wikipedia-based pre-training
Pre-trained on Japanese Wikipedia text, which provides broad coverage of Japanese vocabulary and usage
Suitable for downstream tasks
Can be fine-tuned for downstream tasks such as part-of-speech tagging and dependency parsing (see the sketch below)
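
As a rough sketch of how fine-tuning for such token-level tasks could start, the example below attaches a token-classification head to the checkpoint. The model id and the placeholder label set are assumptions for illustration, not part of the original model card.

```python
# Minimal fine-tuning starting point for character-level token classification
# (e.g. part-of-speech tagging). The model id and the placeholder label set
# are assumptions, not taken from the original model card.
from transformers import AutoTokenizer, AutoModelForTokenClassification

model_id = "KoichiYasuoka/bert-large-japanese-char-extended"
labels = ["NOUN", "VERB", "ADP", "AUX", "PUNCT"]  # placeholder UPOS-style tags

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(
    model_id,
    num_labels=len(labels),
    id2label=dict(enumerate(labels)),
    label2id={label: i for i, label in enumerate(labels)},
)

# Character-level tokenization means each Kanji/kana becomes one token,
# so per-character labels can be assigned directly during fine-tuning.
encoding = tokenizer("国境の長いトンネルを抜けると雪国であった。", return_tensors="pt")
outputs = model(**encoding)
print(outputs.logits.shape)  # (batch, sequence_length, num_labels)
```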

Model Capabilities

Japanese text understanding
Masked language modeling
Character-level processing

Use Cases

Natural Language Processing
Part-of-speech tagging
Can be used for part-of-speech tagging tasks in Japanese text
Dependency parsing
Can be used to analyze the grammatical structure of Japanese sentences
Text filling (fill-mask)
Predicts the character replaced by the [MASK] token (see the sketch below)
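
A minimal fill-mask sketch, assuming the Hugging Face model id KoichiYasuoka/bert-large-japanese-char-extended; the example sentence is an arbitrary illustration. Because tokenization is character-level, [MASK] stands in for a single character.

```python
# Minimal sketch: predict the character hidden behind [MASK].
# The model id and example sentence are assumptions for illustration.
from transformers import pipeline

fill_mask = pipeline(
    "fill-mask",
    model="KoichiYasuoka/bert-large-japanese-char-extended",
)

# Character-level tokenizer: [MASK] covers exactly one character.
for prediction in fill_mask("酸素ボンベを充[MASK]する。"):
    print(prediction["token_str"], round(prediction["score"], 4))
```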