
ALBERT-fa Base v2 NER (ARMAN)

Developed by m3hrdadfi
A lightweight ALBERT-based model for self-supervised language representation learning in Persian
Downloads: 22
Release Time: 3/2/2022

Model Overview

ALBERT-Persian is the first attempt to train the ALBERT model for the Persian language, based on Google's ALBERT Base v2.0 architecture and trained on multi-domain corpora. This checkpoint is fine-tuned for named entity recognition on the ARMAN dataset.

Model Features

Lightweight Architecture
Adopts the ALBERT architecture, whose cross-layer parameter sharing yields far fewer parameters and higher efficiency than comparable BERT models
Multi-domain Training
Trained on corpora from various domains, including science, fiction, and news, for broad coverage
High-performance NER
Excels at Persian named entity recognition, achieving an F1 score of 97.43 on the ARMAN benchmark
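
ALBERT's parameter savings come chiefly from cross-layer parameter sharing: one set of transformer weights is reused at every layer, so depth adds no parameters. A minimal sketch of the counting argument, using made-up layer sizes for illustration (not the real model config):

```python
# Illustrative parameter-count comparison: BERT-style (one weight set per
# layer) vs ALBERT-style (one shared weight set reused across all layers).
# Sizes below are hypothetical, not taken from this model's actual config.

def params_per_layer(hidden: int) -> int:
    # Rough count for one transformer layer: four attention projections
    # plus a feed-forward block (hidden -> 4*hidden -> hidden), biases ignored.
    attention = 4 * hidden * hidden
    feed_forward = 2 * hidden * (4 * hidden)
    return attention + feed_forward

def bert_style(hidden: int, layers: int) -> int:
    # Each layer owns its own weights, so parameters scale with depth.
    return layers * params_per_layer(hidden)

def albert_style(hidden: int, layers: int) -> int:
    # One weight set is shared by every layer; depth is free parameter-wise.
    return params_per_layer(hidden)

if __name__ == "__main__":
    hidden, layers = 768, 12
    print(f"BERT-style:   {bert_style(hidden, layers):,}")
    print(f"ALBERT-style: {albert_style(hidden, layers):,}")
```

With these toy numbers the shared variant uses 1/12 of the transformer-layer parameters, which is the intuition behind the "lightweight" claim.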

Model Capabilities

Persian text understanding
Named entity recognition
Language representation learning
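
Token-classification models such as this one typically emit a per-token IOB tag, which downstream code must merge into entity spans. A minimal decoder sketch, assuming ARMAN-style tags like B-org/I-org (the exact label inventory is an assumption, not the model's verified config):

```python
# Merge per-token IOB tags into (entity_type, text) spans.
# Tag names (B-org, I-org, B-loc, ...) follow the common IOB convention;
# the precise label set of this model is assumed, not verified.

def decode_iob(tokens, tags):
    spans, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            if current:
                spans.append(current)
            current = (tag[2:], [token])       # start a new entity span
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(token)           # continue the open span
        else:
            if current:
                spans.append(current)          # close any open span on "O"
            current = None
    if current:
        spans.append(current)
    return [(label, " ".join(words)) for label, words in spans]
```

For example, `decode_iob(["Tehran", "is", "in", "Iran"], ["B-loc", "O", "O", "B-loc"])` yields `[("loc", "Tehran"), ("loc", "Iran")]`.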

Use Cases

Natural Language Processing
Persian Named Entity Recognition
Identifies entities such as organizations, locations, and person names in Persian text
Achieved an F1 score of 97.43 on the ARMAN dataset
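
The reported F1 is the harmonic mean of precision and recall over predicted entity spans. A minimal sketch of the computation, assuming an exact-match criterion between predicted and gold spans (evaluation scripts may align spans differently):

```python
# Entity-level F1: exact-match precision/recall over (type, text) spans,
# then their harmonic mean. The exact-match criterion is an assumption;
# the benchmark's official scorer may differ.
from collections import Counter

def entity_f1(gold_spans, pred_spans):
    gold, pred = Counter(gold_spans), Counter(pred_spans)
    true_pos = sum((gold & pred).values())     # spans present in both sets
    if not gold_spans or not pred_spans or true_pos == 0:
        return 0.0
    precision = true_pos / sum(pred.values())
    recall = true_pos / sum(gold.values())
    return 2 * precision * recall / (precision + recall)
```

With one of two gold entities found and no false positives, precision is 1.0, recall is 0.5, and F1 is 2/3; a score of 97.43 therefore implies both precision and recall are very high on ARMAN.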