# Arabic Masked Language Modeling
## Albert Xlarge Arabic
An Arabic version of the ALBERT Xlarge pretrained language model, trained on approximately 4.4 billion words, supporting Modern Standard Arabic and some dialectal content.
- Category: Large Language Model
- Tags: Transformers, Arabic
- Author: asafaya · Downloads: 64 · Likes: 1
## Arabertmo Base V2
AraBERTMo is an Arabic pre-trained language model based on the BERT architecture, supporting masked language modeling tasks.
- Category: Large Language Model
- Tags: Transformers, Arabic
- Author: Ebtihal · Downloads: 17 · Likes: 0
## Arabertmo Base V4
AraBERTMo is an Arabic pre-trained language model based on the BERT architecture, supporting masked language modeling tasks.
- Category: Large Language Model
- Tags: Transformers, Arabic
- Author: Ebtihal · Downloads: 15 · Likes: 0
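All of the models above are pretrained with the masked language modeling objective. As a minimal sketch of how BERT-style pretraining corrupts its input before asking the model to recover the originals (the `mask_tokens` helper and the word-level tokens are illustrative assumptions; real implementations operate on tokenizer IDs, but the standard BERT recipe is the same: mask ~15% of positions, and of those replace 80% with `[MASK]`, 10% with a random token, and leave 10% unchanged):

```python
import random

def mask_tokens(tokens, mask_token="[MASK]", vocab=None, mlm_prob=0.15, seed=0):
    """BERT-style input corruption (hypothetical helper, not from these model cards).

    Selects ~mlm_prob of positions as prediction targets; of those,
    80% become mask_token, 10% a random vocab token, 10% stay unchanged.
    Returns (masked_tokens, labels), where labels[i] holds the original
    token at target positions and None everywhere else.
    """
    rng = random.Random(seed)
    vocab = vocab or tokens          # fall back to the sequence itself as a vocab
    masked = list(tokens)
    labels = [None] * len(tokens)
    for i, tok in enumerate(tokens):
        if rng.random() < mlm_prob:  # position chosen as a prediction target
            labels[i] = tok
            r = rng.random()
            if r < 0.8:
                masked[i] = mask_token       # 80%: replace with [MASK]
            elif r < 0.9:
                masked[i] = rng.choice(vocab)  # 10%: replace with a random token
            # remaining 10%: keep the original token
    return masked, labels
```

During pretraining, the model only computes loss at positions where `labels` is not `None`, which is what "supporting masked language modeling tasks" refers to in the cards above.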