# Wikipedia Corpus
## Bert Base Indonesian 522M
**Author:** cahya · **License:** MIT · **Tags:** Large Language Model, Other · **Downloads:** 2,799 · **Likes:** 25

A BERT base model pretrained on Indonesian Wikipedia with the masked language modeling (MLM) objective. The model is uncased (case insensitive).
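Since the model is trained for masked language modeling, the quickest way to exercise it is the `fill-mask` pipeline from the `transformers` library. A minimal sketch follows; the repo ID `cahya/bert-base-indonesian-522M` is inferred from the author and model name above, and the example sentence is purely illustrative.

```python
from transformers import pipeline

# Fill-mask pipeline for the uncased Indonesian BERT model.
# Repo ID is inferred from the author and model name listed above.
fill_mask = pipeline("fill-mask", model="cahya/bert-base-indonesian-522M")

# BERT-style models mark the blank with the [MASK] token.
# "Ibu kota Indonesia adalah [MASK]." = "The capital of Indonesia is [MASK]."
for pred in fill_mask("Ibu kota Indonesia adalah [MASK]."):
    print(f"{pred['token_str']}\t{pred['score']:.3f}")
```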
## Albert Large Arabic
**Author:** asafaya · **Tags:** Large Language Model, Transformers, Arabic · **Downloads:** 45 · **Likes:** 1

A pretrained Arabic version of the ALBERT large model, trained on approximately 4.4 billion words of Arabic text.
## Bert Base Arabic
**Author:** asafaya · **Tags:** Large Language Model, Arabic · **Downloads:** 14.4k · **Likes:** 38

A pretrained Arabic BERT base language model covering Modern Standard Arabic and some dialects.
## Albert Base Arabic
**Author:** asafaya · **Tags:** Large Language Model, Transformers, Arabic · **Downloads:** 35 · **Likes:** 0

A pretrained Arabic ALBERT base language model trained on approximately 4.4 billion words of Arabic data, covering Modern Standard Arabic and some dialectal content.
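The three Arabic models above share the same masked-LM interface, so one loading pattern covers them all; swapping the repo ID switches between the BERT and ALBERT variants. A minimal sketch, assuming the repo ID `asafaya/bert-base-arabic` (and, analogously, `asafaya/albert-base-arabic` and `asafaya/albert-large-arabic`) inferred from the author and model names:

```python
import torch
from transformers import AutoModelForMaskedLM, AutoTokenizer

# Works for the BERT and ALBERT variants alike; the repo ID is an
# assumption inferred from the author and model name listed above.
model_name = "asafaya/bert-base-arabic"
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModelForMaskedLM.from_pretrained(model_name)

# "العاصمة السعودية هي [MASK]." = "The Saudi capital is [MASK]."
text = f"العاصمة السعودية هي {tokenizer.mask_token}."
inputs = tokenizer(text, return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Top-5 candidate tokens for the masked position.
mask_pos = (inputs["input_ids"] == tokenizer.mask_token_id).nonzero()[0, 1]
top_ids = logits[0, mask_pos].topk(5).indices.tolist()
print(tokenizer.convert_ids_to_tokens(top_ids))
```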
## Japanese Roberta Base
**Author:** rinna · **License:** MIT · **Tags:** Large Language Model, Transformers, Japanese · **Downloads:** 9,375 · **Likes:** 37

A base-sized Japanese RoBERTa model trained by rinna Co., Ltd., suitable for masked language modeling on Japanese text.
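The same fill-mask pattern applies here, with a caveat: rinna's tokenizer is SentencePiece-based and the model card documents tokenizer-specific preprocessing, so treat this naive pipeline sketch as an assumption to verify against the card. The repo ID `rinna/japanese-roberta-base` is inferred from the author and model name above.

```python
from transformers import pipeline

# Minimal masked-LM sketch; repo ID inferred from the author and model
# name above. rinna's model card describes tokenizer-specific
# preprocessing, so verify this usage against it before relying on it.
fill_mask = pipeline("fill-mask", model="rinna/japanese-roberta-base")

# "日本の首都は[MASK]です。" = "The capital of Japan is [MASK]."
mask = fill_mask.tokenizer.mask_token
for pred in fill_mask(f"日本の首都は{mask}です。"):
    print(pred["token_str"], round(pred["score"], 3))
```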
## Wangchanberta Base Wiki Newmm
**Author:** airesearch · **Tags:** Large Language Model, Other · **Downloads:** 115 · **Likes:** 2

A RoBERTa base model pretrained on Thai Wikipedia, suitable for Thai text processing tasks.
## Bert Base En Tr Cased
**Author:** Geotrend · **License:** Apache-2.0 · **Tags:** Large Language Model, Other · **Downloads:** 21 · **Likes:** 0

A smaller model derived from bert-base-multilingual-cased that handles English and Turkish while preserving the original model's accuracy.
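One way to see what "streamlined" means in practice is to compare tokenizer vocabulary sizes: the derived model keeps only the English and Turkish portions of the multilingual vocabulary, which shrinks the embedding matrix. A minimal sketch, assuming the repo ID `Geotrend/bert-base-en-tr-cased` inferred from the author and model name above:

```python
from transformers import AutoTokenizer

# Compare vocabulary sizes to see the reduction relative to the
# original multilingual model. The repo ID for the smaller model is
# an assumption inferred from the author and model name listed above.
small = AutoTokenizer.from_pretrained("Geotrend/bert-base-en-tr-cased")
full = AutoTokenizer.from_pretrained("bert-base-multilingual-cased")

print("en-tr vocab size:       ", small.vocab_size)
print("multilingual vocab size:", full.vocab_size)
```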