# Enterprise NLP
## LINE DistilBERT Base Japanese
Apache-2.0 · line-corporation · 12.92k downloads · 45 likes

DistilBERT model pre-trained on 131 GB of Japanese web text, developed by LINE Corporation.

Tags: Large Language Model · Transformers · Japanese

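To try the model, a minimal sketch with the Hugging Face `transformers` library is shown below. The repo id `line-corporation/line-distilbert-base-japanese`, the `trust_remote_code=True` flag (for the custom Japanese tokenizer), and the example sentence are assumptions, not taken from this listing.

```python
# Minimal sketch: masked-LM fill-in with the LINE DistilBERT Japanese model.
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

repo_id = "line-corporation/line-distilbert-base-japanese"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id, trust_remote_code=True)
model = AutoModelForMaskedLM.from_pretrained(repo_id)

text = "LINEの[MASK]は自然言語処理の研究をしている。"
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Take the highest-scoring token at the masked position.
mask_pos = (inputs.input_ids == tokenizer.mask_token_id).nonzero(as_tuple=True)[1]
top_id = logits[0, mask_pos].argmax(dim=-1)
print(tokenizer.decode(top_id))
```
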
## viniLM 2021 From Large
Apache-2.0 · VMware · 23 downloads · 2 likes

A compact pre-trained language model distilled by VMware from vBERT-2021-large using the MiniLMv2 technique, improving inference speed while maintaining performance.

Tags: Large Language Model · Transformers · English

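As a rough illustration of using the distilled model as a lightweight encoder, the sketch below mean-pools its hidden states into sentence vectors. The repo id `VMware/vinilm-2021-from-large` and the sample sentences are assumptions based on the listed publisher and name.

```python
# Minimal sketch: sentence embeddings via mean pooling over non-padding tokens.
import torch
from transformers import AutoModel, AutoTokenizer

repo_id = "VMware/vinilm-2021-from-large"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(repo_id)
model = AutoModel.from_pretrained(repo_id)

sentences = ["vSphere cluster upgrade failed.", "The host could not be updated."]
batch = tokenizer(sentences, padding=True, truncation=True, return_tensors="pt")
with torch.no_grad():
    hidden = model(**batch).last_hidden_state        # (batch, seq, dim)

# Mean-pool over real tokens only, giving one vector per sentence.
mask = batch.attention_mask.unsqueeze(-1)            # (batch, seq, 1)
embeddings = (hidden * mask).sum(dim=1) / mask.sum(dim=1)
print(embeddings.shape)
```
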
## vBERT 2021 Large
Apache-2.0 · VMware · 14 downloads · 3 likes

A customized BERT model developed by VMware, optimized for technical documents and proprietary terminology.

Tags: Large Language Model · Transformers · English

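For a quick check of how it handles technical wording, a fill-mask sketch using the `transformers` pipeline API is shown below. The repo id `VMware/vbert-2021-large` and the prompt are assumptions, not taken from this listing.

```python
from transformers import pipeline

# Assumed repo id; [MASK] is the standard BERT mask token.
fill = pipeline("fill-mask", model="VMware/vbert-2021-large")
for candidate in fill("Use [MASK] to manage virtual machines in the datacenter."):
    print(candidate["token_str"], round(candidate["score"], 3))
```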