AraGPT2 Base

Developed by aubmindlab
AraGPT2 is a pre-trained, Transformer-based Arabic text generation model developed by AUB MIND Lab and released in several sizes.
Downloads 21.26k
Release date: 3/2/2022

Model Overview

AraGPT2 is a series of GPT-2 models optimized for Arabic text generation. It is available in base, medium, large, and mega variants and supports training and fine-tuning on both GPU and TPU.
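As a minimal sketch, the base model can be used for generation through the HuggingFace Transformers pipeline API. The Hub model id `aubmindlab/aragpt2-base` is the published one for this model; downloading the weights requires network access, and the Arabic prompt below is just an illustration:

```python
from transformers import pipeline

# Load the base variant from the HuggingFace Hub
# (weights are downloaded on first use).
generator = pipeline("text-generation", model="aubmindlab/aragpt2-base")

# Arabic prompt ("Artificial intelligence is"); greedy decoding
# (do_sample=False) keeps the output reproducible.
prompt = "الذكاء الاصطناعي هو"
outputs = generator(prompt, max_new_tokens=30, do_sample=False)
print(outputs[0]["generated_text"])
```

The pipeline returns the prompt followed by the generated continuation; for more varied text, enable sampling and set `temperature` or `top_p`.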

Model Features

Multi-scale Models
Offers four variants ranging from 135M parameters (base) to 1.46B parameters (mega) to suit different computational budgets
Arabic Optimization
Tailored to the characteristics of the Arabic language and trained on 77GB of high-quality Arabic text
TPU/GPU Support
Supports training and fine-tuning on both GPU and TPU via the TPUEstimator API
Transformers Compatibility
The base and medium versions are fully compatible with the HuggingFace Transformers library, while the large and mega versions can be adapted with wrapper classes
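To illustrate the multi-scale point, the parameter counts can be kept in a small lookup table for choosing the largest variant that fits a compute budget. The base and mega figures come from this page; the medium and large figures are approximate counts as reported in the AraGPT2 paper, and the helper function itself is hypothetical:

```python
# Approximate parameter counts (in millions) for the four AraGPT2 variants.
# base and mega are stated on this page; medium and large are taken from
# the AraGPT2 paper (assumption, not stated here).
ARAGPT2_VARIANTS = {
    "base": 135,
    "medium": 370,
    "large": 792,
    "mega": 1460,
}

def pick_variant(max_params_m: int) -> str:
    """Return the largest variant whose parameter count fits the budget
    (given in millions of parameters)."""
    fitting = [(params, name) for name, params in ARAGPT2_VARIANTS.items()
               if params <= max_params_m]
    if not fitting:
        raise ValueError("no AraGPT2 variant fits the given budget")
    return max(fitting)[1]

print(pick_variant(500))  # -> medium: largest variant under 500M parameters
```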

Model Capabilities

Arabic Text Generation
Text Auto-completion
Language Model Fine-tuning

Use Cases

Content Generation
News Writing Assistance
Generates news article snippets based on prompts
Produces coherent text adhering to Arabic grammar and style
Story Creation
Generates complete stories from opening prompts
Maintains narrative coherence and cultural relevance
Educational Applications
Language Learning
Generates Arabic learning materials and exercises
Provides customized content aligned with learning objectives