# JGLUE Benchmark
| Model | Developer | License | Downloads | Likes | Description |
|---|---|---|---|---|---|
| LINE DistilBERT Base Japanese | line-corporation | Apache-2.0 | 12.92k | 45 | DistilBERT model pre-trained on 131 GB of Japanese web text, developed by LINE Corporation. |
| LUKE Japanese Base Lite | studio-ousia | Apache-2.0 | 403 | 8 | Lightweight version of LUKE Japanese Base, a pre-trained knowledge-enhanced contextual word and entity representation model optimized for Japanese tasks. |
| RoBERTa Large Japanese | nlp-waseda | | 227 | 23 | Large Japanese RoBERTa model pretrained on Japanese Wikipedia and the Japanese portion of CC-100, suitable for Japanese natural language processing tasks. |

All three models are tagged Large Language Model, Transformers, and Japanese.