# Japanese Character-level BERT

## Bert Base Japanese Char Whole Word Masking

A BERT model pre-trained on Japanese text using character-level tokenization and whole word masking, suitable for Japanese natural language processing tasks.

- Tags: Large Language Model, Japanese
- Author: tohoku-nlp
- Downloads: 1,724
- Likes: 4
## Bert Base Japanese Char V2

A BERT model pre-trained on Japanese text using character-level tokenization and whole word masking, trained on the Japanese Wikipedia snapshot of August 31, 2020.

- Tags: Large Language Model, Japanese
- Author: tohoku-nlp
- Downloads: 134.28k
- Likes: 6
## Bert Base Japanese Basic Char V2

A Japanese BERT model pre-trained with character-level tokenization and whole word masking that does not depend on the `fugashi` or `unidic_lite` toolkits.

- Tags: Large Language Model, Transformers, Japanese
- Author: hiroshi-matsuda-rit
- Downloads: 14
- Likes: 0
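
As a minimal sketch of how these character-level models are used, the snippet below loads one of the listed checkpoints with Hugging Face `transformers` and runs a fill-mask prediction on a single masked character. The repository id `tohoku-nlp/bert-base-japanese-char-v2` is assumed from the listing above; its tokenizer additionally requires the `fugashi` and `unidic-lite` packages for word-level pre-tokenization (the hiroshi-matsuda-rit variant above removes that dependency).

```python
# Minimal sketch, assuming the Hugging Face repo id matches the listing.
# Requires: pip install transformers torch fugashi unidic-lite
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

model_name = "tohoku-nlp/bert-base-japanese-char-v2"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# Character-level tokenization: each Japanese character becomes one token.
text = "東京大学で自然言語処理を研究する。"
print(tokenizer.tokenize(text))  # e.g. ['東', '京', '大', '学', ...]

# Fill-mask: predict the single masked character.
masked = "東京は日本の首[MASK]です。"
inputs = tokenizer(masked, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Locate the [MASK] position and take the highest-scoring vocabulary entry.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
predicted_id = logits[0, mask_pos].argmax(-1).item()
print(tokenizer.decode([predicted_id]))  # likely "都"
```

Because the subword tokenizer splits text into individual characters, masked-language predictions operate at the character level, which is why whole word masking (masking all characters of a word together during pre-training) matters for these checkpoints.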