# Japanese Text Processing
## Sbert Jsnli Luke Japanese Base Lite
- **Author:** oshizo
- **License:** Apache-2.0
- **Tags:** Text Embedding, Transformers, Japanese
- **Downloads:** 9,113 · **Likes:** 35

A Japanese sentence-embedding model based on sentence-transformers. It maps sentences and paragraphs into a 768-dimensional vector space, making it suitable for tasks such as clustering and semantic search.
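The semantic-search use case above boils down to comparing embedding vectors by cosine similarity. The sketch below uses random stand-in vectors rather than real model output (loading the model requires a download); the commented-out `SentenceTransformer` lines show the usual way such embeddings would be produced, and the 768 dimension matches the model's output size.

```python
import numpy as np

# Stand-in embeddings. In practice they would come from something like:
#   from sentence_transformers import SentenceTransformer
#   model = SentenceTransformer("oshizo/sbert-jsnli-luke-japanese-base-lite")
#   corpus = model.encode(sentences)  # shape: (n_sentences, 768)

def cosine_similarity(query: np.ndarray, corpus: np.ndarray) -> np.ndarray:
    """Cosine similarity between one query vector and a matrix of vectors."""
    query = query / np.linalg.norm(query)
    corpus = corpus / np.linalg.norm(corpus, axis=1, keepdims=True)
    return corpus @ query

def semantic_search(query_vec, corpus_vecs, top_k=3):
    """Return (index, similarity) pairs for the top_k most similar vectors."""
    sims = cosine_similarity(query_vec, corpus_vecs)
    order = np.argsort(-sims)[:top_k]
    return [(int(i), float(sims[i])) for i in order]

rng = np.random.default_rng(0)
corpus = rng.normal(size=(5, 768))               # stand-in encoded corpus
query = corpus[2] + 0.01 * rng.normal(size=768)  # near-duplicate of sentence 2
print(semantic_search(query, corpus))            # sentence 2 ranks first
```

In high-dimensional spaces, unrelated random vectors are nearly orthogonal, so the perturbed copy of sentence 2 dominates the ranking; with real embeddings, semantically close sentences play that role.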
## Bert Base Japanese Char
- **Author:** tohoku-nlp
- **Tags:** Large Language Model, Japanese
- **Downloads:** 116.10k · **Likes:** 8

A BERT model pretrained on Japanese text with character-level tokenization, suitable for Japanese natural language processing tasks.
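Character-level tokenization means each Japanese character becomes its own token, which sidesteps word segmentation entirely. A minimal sketch, assuming a toy vocabulary (the real model ships its own fixed character vocab):

```python
# Toy character-level tokenizer; the vocabulary here is invented for
# illustration, not the model's actual one.
def char_tokenize(text: str) -> list[str]:
    """Split text into single-character tokens."""
    return list(text)

def encode(text: str, vocab: dict, unk_id: int = 0) -> list[int]:
    """Map each character to its vocab id, falling back to [UNK]."""
    return [vocab.get(ch, unk_id) for ch in char_tokenize(text)]

vocab = {"[UNK]": 0, "日": 1, "本": 2, "語": 3}
tokens = char_tokenize("日本語")
ids = encode("日本語です", vocab)
print(tokens)  # ['日', '本', '語']
print(ids)     # [1, 2, 3, 0, 0]
```

The trade-off is longer input sequences in exchange for a small vocabulary and no out-of-vocabulary words beyond unseen characters.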
## Sentence Bert Base Ja Mean Tokens
- **Author:** sonoisa
- **Tags:** Text Embedding, Japanese
- **Downloads:** 51.01k · **Likes:** 9

A Japanese-specific sentence-embedding model based on the BERT architecture, designed to generate semantic vector representations of sentences and compute sentence similarity.
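As the "mean tokens" in the name suggests, this family of models derives a sentence vector by mean-pooling BERT's token embeddings, ignoring padding via the attention mask. A numpy sketch of that pooling step, with toy shapes standing in for real model output:

```python
import numpy as np

def mean_pooling(token_embeddings: np.ndarray,
                 attention_mask: np.ndarray) -> np.ndarray:
    """Average token embeddings over the sequence, skipping padding.

    token_embeddings: (batch, seq_len, dim), attention_mask: (batch, seq_len)
    with 1 for real tokens and 0 for padding.
    """
    mask = attention_mask[..., None].astype(float)   # (batch, seq_len, 1)
    summed = (token_embeddings * mask).sum(axis=1)   # (batch, dim)
    counts = mask.sum(axis=1).clip(min=1e-9)         # avoid divide-by-zero
    return summed / counts

# Toy batch: 1 sentence, 4 token positions (last is padding), dim 3.
emb = np.array([[[1., 1., 1.],
                 [2., 2., 2.],
                 [3., 3., 3.],
                 [9., 9., 9.]]])   # the padding row must not affect the mean
mask = np.array([[1, 1, 1, 0]])
print(mean_pooling(emb, mask))     # [[2. 2. 2.]]
```

Without the mask, the padding row would drag the sentence vector toward arbitrary values, so masked pooling is the standard recipe.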
## Roberta Base Japanese Aozora
- **Author:** KoichiYasuoka
- **Tags:** Large Language Model, Transformers, Japanese
- **Downloads:** 17 · **Likes:** 0

A Japanese RoBERTa model pretrained on Aozora Bunko texts, supporting masked language modeling tasks.
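In masked language modeling, the model outputs a score (logit) for every vocabulary entry at the masked position, and the prediction is the highest-probability candidate after a softmax. The sketch below uses an invented five-word vocabulary and made-up logits to show that final step; in practice one would use the `transformers` fill-mask pipeline with this model instead.

```python
import numpy as np

def softmax(x: np.ndarray) -> np.ndarray:
    """Numerically stable softmax over a vector of logits."""
    x = x - x.max()
    e = np.exp(x)
    return e / e.sum()

# Toy logits a masked-LM head might emit for one [MASK] position;
# both the vocabulary and the scores are invented for illustration.
vocab = ["日", "本", "語", "人", "国"]
logits = np.array([0.2, 3.1, 0.5, 1.0, 2.4])
probs = softmax(logits)
best = vocab[int(np.argmax(probs))]
print(best)  # "本", the highest-scoring candidate
```

The softmax turns raw scores into a proper distribution over candidates, which is what fill-mask interfaces report as per-token confidence.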
## Bert Base Ja Cased
- **Author:** Geotrend
- **License:** Apache-2.0
- **Tags:** Large Language Model, Japanese
- **Downloads:** 13 · **Likes:** 0

A slimmed-down Japanese version of bert-base-multilingual-cased that preserves the original model's accuracy.
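A common way such "slim" language-specific variants are built is vocabulary trimming: keep only the embedding rows for tokens the target language actually uses, and remap token ids onto the smaller table. A toy sketch of that idea, with invented vocab entries and embedding sizes (not Geotrend's actual pipeline):

```python
import numpy as np

# Toy multilingual vocab and embedding matrix (8 tokens, dim 4).
full_vocab = ["[PAD]", "[UNK]", "the", "日", "本", "語", "le", "der"]
full_embeddings = np.arange(len(full_vocab) * 4, dtype=float).reshape(-1, 4)

# Keep special tokens plus the Japanese subset; drop everything else.
keep = ["[PAD]", "[UNK]", "日", "本", "語"]
keep_ids = [full_vocab.index(tok) for tok in keep]

# Remap to contiguous new ids and copy the matching embedding rows unchanged,
# so the kept tokens keep exactly their original vectors.
slim_vocab = {tok: new_id for new_id, tok in enumerate(keep)}
slim_embeddings = full_embeddings[keep_ids]

print(slim_embeddings.shape)  # (5, 4)
print(slim_vocab["語"])       # 4 (was id 5 in the full vocab)
```

Because the kept rows are bit-identical to the original ones, the trimmed model behaves the same on text covered by the reduced vocabulary, which is how accuracy can be preserved while the embedding table shrinks.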