AraBERTMo Base V6

Developed by Ebtihal
AraBERTMo is an Arabic pre-trained language model based on the BERT architecture, supporting fill-mask tasks.
Release Time: 3/2/2022

Model Overview

AraBERTMo is an Arabic pre-trained language model based on Google's BERT architecture, primarily used for Arabic fill-mask tasks. The model is pre-trained on the OSCAR Arabic corpus and is suitable for various Arabic natural language processing tasks.

Model Features

Arabic optimization
Specifically pre-trained and optimized for Arabic, enabling better handling of Arabic text.
BERT architecture
Based on the proven BERT-Base architecture, ensuring reliable performance.
Multiple variants
Released in 10 different model variants to suit various needs.

Model Capabilities

Arabic text understanding
Fill-mask prediction
Arabic natural language processing

Use Cases

Natural language processing
Arabic text completion
Predicts masked words in Arabic sentences.
Arabic question-answering systems
Serves as a foundational model for Arabic question-answering systems.
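The fill-mask use case above can be sketched with the Hugging Face `transformers` pipeline. This is a minimal sketch, not the model's documented usage: the Hub model id `Ebtihal/AraBertMo_base_V6` is an assumption inferred from the model name and author on this card, so verify it before relying on it.

```python
# Sketch of Arabic fill-mask prediction with AraBERTMo.
# The Hub model id below is an assumption inferred from this card.

MODEL_ID = "Ebtihal/AraBertMo_base_V6"  # assumed Hugging Face Hub id
MASK_TOKEN = "[MASK]"  # standard BERT mask token


def build_masked_prompt(sentence: str, target_word: str) -> str:
    """Replace the word to be predicted with BERT's [MASK] token."""
    return sentence.replace(target_word, MASK_TOKEN)


def predict_masked(prompt: str, top_k: int = 5):
    """Run the fill-mask pipeline. Requires `pip install transformers`
    and network access to download the model weights."""
    from transformers import pipeline  # imported lazily: heavy dependency

    fill_mask = pipeline("fill-mask", model=MODEL_ID, top_k=top_k)
    return fill_mask(prompt)  # list of dicts with "token_str" and "score"


# "The capital of Egypt is Cairo", with the last word masked out:
prompt = build_masked_prompt("عاصمة مصر هي القاهرة", "القاهرة")
# predictions = predict_masked(prompt)  # uncomment once the model id is confirmed
```

The pipeline returns the top-k candidate tokens for the masked position with their scores, which is the building block for the text-completion and question-answering scenarios listed above.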