
Albert Fa Base V2 Clf Digimag

Developed by m3hrdadfi
The first lightweight ALBERT model for Persian, trained following Google's ALBERT BASE v2.0 configuration
Downloads 14
Release Time: 3/2/2022

Model Overview

A lightweight ALBERT (A Lite BERT) model for self-supervised language representation learning in Persian, suited to natural language processing tasks such as text classification
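
A minimal inference sketch, assuming the checkpoint is published on the Hugging Face Hub under the ID m3hrdadfi/albert-fa-base-v2-clf-digimag (the ID is inferred from the author and model names on this page, not confirmed by it) and that the transformers and sentencepiece packages are installed:

```python
# Hedged sketch: the Hub ID below is an assumption, not stated on this page.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="m3hrdadfi/albert-fa-base-v2-clf-digimag",
)

# Classify a short Persian snippet; the pipeline returns the predicted
# category label and a confidence score.
print(classifier("این گوشی هوشمند دوربین بسیار خوبی دارد."))
```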

Model Features

Lightweight design
Compared with standard BERT models, ALBERT uses cross-layer parameter sharing and a factorized embedding parameterization to substantially reduce model size; a rough illustration follows this list
Persian language optimization
Trained specifically for Persian, on text spanning a variety of genres, including science, fiction, and news
Large-scale training data
Trained using 3.9 million documents, 73 million sentences, and 1.3 billion words
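
A self-contained illustration of the size effect of parameter sharing, using randomly initialized base-sized configurations from the transformers library; these config values approximate ALBERT-base and BERT-base and are not this model's exact settings:

```python
# Compare parameter counts of a base-sized ALBERT (one shared transformer layer
# reused across 12 layers, 128-dim factorized embeddings) against a base-sized
# BERT (12 independent layers). Both are randomly initialized; no download needed.
from transformers import AlbertConfig, AlbertModel, BertConfig, BertModel

albert = AlbertModel(AlbertConfig(hidden_size=768,
                                  num_attention_heads=12,
                                  intermediate_size=3072))
bert = BertModel(BertConfig())  # defaults correspond to a BERT-base-sized model

print("ALBERT-base-like parameters:", sum(p.numel() for p in albert.parameters()))
print("BERT-base-like parameters:  ", sum(p.numel() for p in bert.parameters()))
```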

Model Capabilities

Persian text understanding
Text classification
Language representation learning

Use Cases

News classification
DigiMag news classification
7-category classification of articles from DigiMag, the online magazine of the Persian e-commerce site Digikala
Achieved an F1 score of 92.33 (a placeholder evaluation sketch follows)
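
A minimal sketch of how a seven-class F1 score could be computed on a held-out DigiMag test split; the label ids and predictions below are hypothetical placeholders, and whether the reported 92.33 is macro- or weighted-averaged is not stated on this page:

```python
# Placeholder evaluation: replace y_true/y_pred with gold labels and model
# predictions over the actual DigiMag test set to reproduce the reported score.
from sklearn.metrics import f1_score

y_true = [0, 1, 2, 3, 4, 5, 6, 2, 1]   # hypothetical gold category ids (7 classes)
y_pred = [0, 1, 2, 3, 4, 5, 6, 2, 3]   # hypothetical model predictions

print("macro F1 (%):   ", 100 * f1_score(y_true, y_pred, average="macro"))
print("weighted F1 (%):", 100 * f1_score(y_true, y_pred, average="weighted"))
```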