
ArabicTransformer Base

Developed by sultan
An efficient Arabic language model based on the Funnel Transformer architecture and the ELECTRA pretraining objective, offering low computational cost and strong downstream performance
Downloads: 17
Release Time: 3/2/2022

Model Overview

This model combines the Funnel Transformer architecture with the ELECTRA pretraining objective and was pretrained on a 44GB Arabic corpus. The design significantly reduces pretraining computational cost while maintaining high performance, making it suitable for a variety of Arabic NLP tasks.
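As a rough illustration of the Funnel design, the sketch below loads the checkpoint with the Hugging Face transformers library and prints the sequence lengths of its hidden states. The hub ID sultan/ArabicTransformer-base is an assumption based on this card, and the exact number of pooled states depends on the checkpoint's block layout.

```python
import torch
from transformers import AutoTokenizer, AutoModel

# Assumed hub ID for this checkpoint; adjust if the actual repository name differs.
MODEL_ID = "sultan/ArabicTransformer-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
# For a Funnel Transformer checkpoint, AutoModel resolves to the full
# encoder-decoder model (pooling encoder plus an upsampling decoder).
model = AutoModel.from_pretrained(MODEL_ID)

# Example Arabic input: "an efficient Arabic language model with low computational cost"
inputs = tokenizer("نموذج لغوي عربي فعال منخفض التكلفة الحسابية", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs, output_hidden_states=True)

# The encoder pools the hidden-state sequence between blocks, so intermediate
# hidden states are shorter than the input; the decoder restores the original
# length for token-level tasks.
print("input length:", inputs["input_ids"].shape[1])
for i, h in enumerate(outputs.hidden_states):
    print(f"hidden state {i}: sequence length {h.shape[1]}")
```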

Model Features

Efficient Computation
The Funnel Transformer encoder compresses the hidden-state sequence between blocks, significantly reducing pretraining computational cost
ELECTRA Objective
Uses the ELECTRA replaced-token-detection objective to improve training efficiency and data utilization
High Performance
Achieves or approaches state-of-the-art results on multiple Arabic downstream tasks
Resource Optimization
Pretraining resource consumption is significantly lower than that of comparable state-of-the-art models

Model Capabilities

Text Classification
Question Answering
Arabic Language Understanding

Use Cases

Natural Language Processing
Arabic Question Answering
Applied to the Arabic portion of the TyDi QA dataset, achieving EM 74.70 and F1 85.89 (see the sketch after this section)
Text Classification
Arabic text classification tasks
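For the question answering use case above, a minimal extractive-QA sketch with the transformers library might look like the following. The hub ID is an assumption, the question and context strings are placeholders, and the span-prediction head that AutoModelForQuestionAnswering adds is randomly initialized: it would need to be fine-tuned on the Arabic portion of TyDi QA before the EM/F1 figures quoted above are reachable.

```python
import torch
from transformers import AutoTokenizer, AutoModelForQuestionAnswering

# Assumed hub ID; the QA head added here is untrained and must be fine-tuned
# (e.g. on TyDi QA Arabic) before it produces meaningful answers.
MODEL_ID = "sultan/ArabicTransformer-base"

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = AutoModelForQuestionAnswering.from_pretrained(MODEL_ID)

question = "ما هي عاصمة مصر؟"                      # "What is the capital of Egypt?"
context = "القاهرة هي عاصمة جمهورية مصر العربية."  # "Cairo is the capital of Egypt."

inputs = tokenizer(question, context, return_tensors="pt", truncation=True)
with torch.no_grad():
    outputs = model(**inputs)

# Pick the most likely start and end token positions and decode that span.
start = int(outputs.start_logits.argmax())
end = int(outputs.end_logits.argmax())
answer = tokenizer.decode(inputs["input_ids"][0][start : end + 1], skip_special_tokens=True)
print(answer)
```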