AraBERTMo Base V3
AraBERTMo is an Arabic pre-trained language model based on Google's BERT architecture, supporting fill-mask tasks.
Downloads: 15
Release date: 3/2/2022
Model Overview
AraBERTMo is an Arabic pre-trained language model based on the BERT architecture, used primarily for fill-mask tasks on Arabic text.
Model Features
Arabic Optimization
Specifically pre-trained and optimized for Arabic
BERT Architecture
Based on Google's BERT-base architecture with strong language understanding capabilities
Large-scale Pre-training
Pre-trained using approximately 3 million words from the OSCAR Arabic corpus
Model Capabilities
Arabic text understanding
Fill-mask prediction
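A fill-mask model like this one is typically queried through the Hugging Face transformers pipeline: you supply a sentence containing a [MASK] token and the model ranks candidate words for that position. A minimal sketch follows; the model identifier is an assumption for illustration, so check the model's hub page for the exact name.

```python
# Arabic sentence with one [MASK] token for the model to fill in
# (roughly: "Peace be upon you and the mercy of [MASK] and His blessings").
text = "السلام عليكم ورحمة [MASK] وبركاته"
assert text.count("[MASK]") == 1  # fill-mask expects exactly one mask here

# Hypothetical model ID for illustration; verify it on the hub page.
MODEL_ID = "Ebtihal/AraBertMo_base_V3"

try:
    from transformers import pipeline

    # Build a fill-mask pipeline backed by the pre-trained model.
    fill = pipeline("fill-mask", model=MODEL_ID)
    for pred in fill(text, top_k=3):
        # Each prediction carries a candidate token and its probability.
        print(pred["token_str"], round(pred["score"], 3))
except Exception as exc:
    # Model weights or the transformers library may be unavailable offline.
    print(f"pipeline unavailable ({exc}); install transformers and retry")
```

The same pattern covers the text-completion use case below: masking the missing word and taking the top-ranked prediction restores the most likely completion.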
Use Cases
Natural Language Processing
Arabic Text Completion
Automatically completes missing parts in Arabic text
Arabic Grammar Check
Helps identify and correct grammatical errors in Arabic text