Arabertmo Base V10

Developed by Ebtihal
AraBERTMo is an Arabic pre-trained language model based on Google's BERT architecture, supporting fill-mask tasks.
Downloads 39
Release Time: 3/4/2022

Model Overview

AraBERTMo is an Arabic pre-trained language model based on the BERT architecture, primarily used for fill-mask (masked-token prediction) tasks on Arabic text.

Model Features

Arabic Language Optimization
Specifically pre-trained and optimized for Arabic text
BERT Architecture
Adopts the same architecture and configuration as BERT-Base
Multi-Version Support
Released in 10 model variants

Model Capabilities

Arabic Text Understanding
Fill-Mask Prediction
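The fill-mask capability above can be sketched with the Hugging Face transformers pipeline. The hub ID `Ebtihal/AraBERTMo_base_V10` is assumed from the page title, and the example sentence is illustrative; verify the exact model name on the hub before use.

```python
# Minimal sketch of fill-mask inference with AraBERTMo.
# The model ID below is an assumption inferred from this page's title.

MASK = "[MASK]"  # BERT's mask token


def make_masked(text: str, target: str) -> str:
    """Replace the target word in the text with the mask token."""
    return text.replace(target, MASK)


def predict_mask(text: str, top_k: int = 5):
    """Return (token, score) candidates for the masked position."""
    from transformers import pipeline  # deferred heavy import

    fill = pipeline("fill-mask", model="Ebtihal/AraBERTMo_base_V10")
    return [(p["token_str"], p["score"]) for p in fill(text, top_k=top_k)]


# Example usage (requires downloading the model):
# masked = make_masked("عاصمة مصر هي القاهرة.", "القاهرة")
# for token, score in predict_mask(masked):
#     print(token, round(score, 3))
```

Each prediction carries the candidate token and the model's probability for it, so downstream code can keep only high-confidence completions.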

Use Cases

Text Processing
Arabic Text Completion
Automatically completes missing parts in Arabic text
Arabic Grammar Checking
Identifies and corrects grammatical errors in Arabic text
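One common way a masked language model supports the grammar-checking use case above is to mask each word in turn and check how much probability the model assigns to the original word; low-probability words are flagged as suspicious. This is a generic masked-LM technique, not a procedure documented for AraBERTMo specifically, and the model ID and threshold below are assumptions.

```python
# Sketch: flagging suspicious words with a masked LM, one way to support
# grammar checking. Model ID and threshold are illustrative assumptions.

MASK = "[MASK]"


def masked_variants(words):
    """Return (index, sentence-with-word-i-masked) pairs."""
    return [
        (i, " ".join(words[:i] + [MASK] + words[i + 1 :]))
        for i in range(len(words))
    ]


def suspicious_words(sentence: str, threshold: float = 0.01):
    """Return words whose restored probability falls below threshold."""
    from transformers import pipeline  # deferred heavy import

    fill = pipeline("fill-mask", model="Ebtihal/AraBERTMo_base_V10")
    words = sentence.split()
    flagged = []
    for i, masked in masked_variants(words):
        # Probability the model assigns to the original word at position i;
        # 0.0 if it does not appear among the top candidates.
        score = next(
            (p["score"] for p in fill(masked, top_k=50)
             if p["token_str"].strip() == words[i]),
            0.0,
        )
        if score < threshold:
            flagged.append(words[i])
    return flagged
```

This yields a detector rather than a corrector; the top fill-mask candidate at a flagged position can serve as a suggested replacement.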