Albert Fa Base V2 Ner Peyma

Developed by m3hrdadfi
The first ALBERT model trained specifically for Persian, based on Google's ALBERT Base v2.0 architecture and trained on diverse Persian corpora.
Downloads 19
Release Time: 3/2/2022

Model Overview

A lightweight BERT-style model (ALBERT, "A Lite BERT") for self-supervised learning of Persian language representations, suitable for downstream natural language processing tasks.

Model Features

Lightweight Architecture
Based on ALBERT architecture with fewer parameters and higher efficiency compared to standard BERT models
Diverse Training Data
Trained on diverse Persian corpora comprising over 3.9 million documents, 73 million sentences, and 1.3 billion words
Named Entity Recognition Capability
Specially optimized for Persian named entity recognition tasks

Model Capabilities

Persian text understanding
Named Entity Recognition
Token classification

Use Cases

Natural Language Processing
Persian Named Entity Recognition
Identifying entities such as organizations, person names, and locations from Persian text
Achieved an F1 score of 88.99 on the PEYMA dataset
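As a sketch of how the NER capability described above might be used, the snippet below loads the model through the Hugging Face `transformers` token-classification pipeline. The model id `m3hrdadfi/albert-fa-base-v2-ner-peyma` is an assumption inferred from the developer and model name on this page, and `group_entities` is a hypothetical post-processing helper, not part of the model's published API.

```python
def group_entities(predictions):
    """Flatten pipeline output into (word, entity_label) pairs.

    Expects the list of dicts produced by a transformers
    token-classification pipeline with aggregation enabled.
    """
    return [(p["word"], p["entity_group"]) for p in predictions]


if __name__ == "__main__":
    # Assumed model id; requires `pip install transformers` and a
    # network connection to download the weights on first use.
    from transformers import pipeline

    ner = pipeline(
        "token-classification",
        model="m3hrdadfi/albert-fa-base-v2-ner-peyma",
        aggregation_strategy="simple",  # merge word-piece tokens into words
    )
    text = "سازمان ملل در نیویورک واقع شده است."  # example Persian sentence
    print(group_entities(ner(text)))
```

The `aggregation_strategy="simple"` option merges sub-word predictions back into whole words, which is usually what you want when extracting organizations, person names, and locations as listed above.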