# IPA Dictionary Tokenization
## Bert Base Japanese
A BERT model pretrained on Japanese Wikipedia text, using the IPA dictionary for word-level tokenization; suitable for Japanese natural language processing tasks.

Large Language Model · Japanese · tohoku-nlp · 153.44k · 38
## Bert Base Japanese Whole Word Masking
A BERT model pretrained on Japanese text, using IPA dictionary tokenization together with the whole-word-masking training technique.

Large Language Model · Japanese · tohoku-nlp · 113.33k · 65
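The models above can be loaded through the Hugging Face `transformers` library. The sketch below assumes the `tohoku-nlp/bert-base-japanese-whole-word-masking` Hub ID (the organization formerly published under `cl-tohoku`) and that the `fugashi` and `ipadic` packages are installed, since the tokenizer performs MeCab-based word segmentation with the IPA dictionary before WordPiece splitting:

```python
# Minimal sketch: load the whole-word-masking variant and tokenize a sentence.
# Assumptions: `transformers`, `fugashi`, and `ipadic` are installed, and the
# model ID below matches the current Hub namespace.
from transformers import AutoTokenizer, AutoModelForMaskedLM

model_id = "tohoku-nlp/bert-base-japanese-whole-word-masking"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForMaskedLM.from_pretrained(model_id)

# Text is first segmented into words by MeCab (IPA dictionary),
# then each word is split into WordPiece subword tokens.
tokens = tokenizer.tokenize("東北大学で自然言語処理を研究しています。")
print(tokens)
```

Swapping `model_id` for `tohoku-nlp/bert-base-japanese` loads the base variant listed above instead.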