AraBERTMo Base V2

Developed by Ebtihal
An Arabic pre-trained language model based on the BERT architecture, supporting masked language modeling tasks
Downloads: 17
Release Time: 3/2/2022

Model Overview

AraBERTMo is an Arabic pre-trained language model based on Google's BERT architecture, used primarily for masked language modeling on Arabic text.

Model Features

Arabic Language Optimization
Specifically pre-trained and optimized for Arabic text
BERT Architecture
Built on Google's BERT-base architecture, which provides strong language understanding capabilities
Multi-version Support
Available in 10 different model variants to accommodate various needs

Model Capabilities

Arabic text understanding
Masked language modeling prediction
Contextual semantic analysis
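The masked language modeling capability listed above can be sketched with the Hugging Face `transformers` fill-mask pipeline. The model ID `Ebtihal/Arabertmo_base_V2` is an assumption based on this card's title and may not match the exact name on the Hub; `mask_word` is a hypothetical helper for building masked inputs.

```python
# Minimal sketch of preparing a fill-mask input for a BERT-style Arabic model.
# The model ID below is an assumption taken from this card's title.

def mask_word(sentence: str, word: str, mask_token: str = "[MASK]") -> str:
    """Replace the first occurrence of `word` with the BERT mask token."""
    return sentence.replace(word, mask_token, 1)

# Build a masked Arabic sentence for the model to complete.
text = mask_word("السلام عليكم ورحمة الله وبركاته", "الله")
print(text)  # السلام عليكم ورحمة [MASK] وبركاته

# Running the model itself (requires `transformers` installed and a network
# connection to download the weights):
#
# from transformers import pipeline
# fill_mask = pipeline("fill-mask", model="Ebtihal/Arabertmo_base_V2")
# for prediction in fill_mask(text):
#     print(prediction["token_str"], round(prediction["score"], 3))
```

The pipeline returns the top candidate tokens for the `[MASK]` position together with their probabilities, which is how the "Arabic Text Completion" use case below is typically driven.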

Use Cases

Text Processing
Arabic Text Completion
Automatically completes missing parts in Arabic text
Arabic Grammar Checking
Identifies and corrects grammatical errors in Arabic text