
ALBERT-fa-base-v2

Developed by m3hrdadfi
A lite BERT (ALBERT) model for self-supervised learning of Persian language representations
Downloads: 43
Release time: 3/2/2022

Model Overview

ALBERT-Persian is pretrained on a massive public Persian corpus and is intended primarily to be fine-tuned on downstream tasks such as sentiment analysis, text classification, and named entity recognition.
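As a sketch of how such a model is typically loaded for inference, the snippet below uses the Hugging Face transformers library. The hub id `m3hrdadfi/albert-fa-base-v2` is an assumption based on the author's naming and should be verified on the Hub before use.

```python
# Hypothetical usage sketch (hub id assumed, not confirmed by this page):
# load ALBERT-Persian and query its masked-language-modeling head.
from transformers import pipeline

MODEL_ID = "m3hrdadfi/albert-fa-base-v2"  # assumed Hugging Face hub id

def top_mask_fills(text: str, k: int = 5):
    """Return the k most likely fillers for the [MASK] token in `text`."""
    fill = pipeline("fill-mask", model=MODEL_ID)
    return [(c["token_str"], round(c["score"], 3)) for c in fill(text, top_k=k)]

# Example call (downloads the model weights on first use):
# top_mask_fills("تهران [MASK] ایران است.")  # "Tehran is the [MASK] of Iran."
```

The masked-language-modeling head is what the pretraining objective optimizes; for the downstream tasks listed below, the base encoder would instead be fine-tuned with a task-specific classification head.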

Model Features

Lightweight Design
Based on the ALBERT architecture, which shares parameters across layers and factorizes the embedding matrix, it has far fewer parameters than standard BERT models of comparable depth.
Diverse Training Data
Trained on various Persian data sources including Wikipedia, news, science, and lifestyle content.
Downstream Task Adaptation
Particularly suitable for fine-tuning on downstream tasks such as sentiment analysis, text classification, and named entity recognition.

Model Capabilities

Persian Text Understanding
Masked Language Modeling
Next Sentence Prediction
Sentiment Analysis
Text Classification
Named Entity Recognition
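Masked language modeling, the model's main pretraining objective, hides a fraction of the input tokens and trains the network to recover them. A minimal, framework-free sketch of the standard masking step (BERT/ALBERT mask roughly 15% of tokens) might look like:

```python
import random

def mask_tokens(tokens, mask_rate=0.15, mask_token="[MASK]", seed=0):
    """Replace ~mask_rate of tokens with mask_token; return (masked, labels).

    labels[i] holds the original token where a mask was applied, else None,
    so a model can be trained to predict only the hidden positions.
    """
    rng = random.Random(seed)
    masked, labels = [], []
    for tok in tokens:
        if rng.random() < mask_rate:
            masked.append(mask_token)
            labels.append(tok)   # target the model must recover
        else:
            masked.append(tok)
            labels.append(None)  # position is not scored
    return masked, labels

# "Tehran capital-of Iran is" — a high mask_rate just for illustration
masked, labels = mask_tokens(["تهران", "پایتخت", "ایران", "است"],
                             mask_rate=0.5, seed=1)
```

This is only the data-preparation half of the objective; the production recipe also swaps some selected tokens for random ones or leaves them unchanged instead of always inserting `[MASK]`.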

Use Cases

Sentiment Analysis
Digikala Review Sentiment Analysis
Analyze the sentiment of user reviews on the Iranian e-commerce platform Digikala
F1 score: 81.12
Snappfood Review Sentiment Analysis
Analyze the sentiment of user reviews on the food delivery platform Snappfood
F1 score: 85.79
Text Classification
Digikala Magazine Classification
Classify article content from Digikala's online magazine
Accuracy: 92.33
Persian News Classification
Classify Persian news articles by topic
Accuracy: 97.01
Named Entity Recognition
PEYMA Dataset NER
Identify named entities in Persian text on the PEYMA dataset
F1 score: 88.99
ARMAN Dataset NER
Named entity recognition on the ARMAN dataset
F1 score: 97.43
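The NER figures above are F1 scores. As a sketch of how such a number is computed, the function below implements entity-level precision, recall, and F1 over predicted and gold entity spans: a common convention for PEYMA/ARMAN-style evaluation, assumed here rather than taken from the benchmarks' own scoring scripts.

```python
def entity_f1(gold, pred):
    """Entity-level F1: a prediction counts as correct only if its
    (start, end, type) triple exactly matches a gold entity."""
    gold_set, pred_set = set(gold), set(pred)
    tp = len(gold_set & pred_set)          # exact-match true positives
    precision = tp / len(pred_set) if pred_set else 0.0
    recall = tp / len(gold_set) if gold_set else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Spans as (start_token, end_token, entity_type); data is illustrative.
gold = [(0, 1, "LOC"), (4, 6, "PER"), (8, 8, "ORG")]
pred = [(0, 1, "LOC"), (4, 6, "PER"), (8, 8, "PER")]
score = entity_f1(gold, pred)  # 2 of 3 predictions match exactly
```

Note that the last prediction fails despite covering the right tokens, because its type differs; this exact-match convention is what makes entity-level F1 stricter than token accuracy.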