Bert Base Parsbert Ner Uncased

Developed by: HooshvareLab
A Transformer-based Persian language model fine-tuned for Persian Named Entity Recognition (NER)
Downloads: 6,130
Released: 3/2/2022

Model Overview

ParsBERT is a monolingual Persian model based on the BERT architecture. This NER variant excels on the Persian NER datasets ARMAN and PEYMA and supports recognition of 7 entity types.

Model Features

Whole Word Masking Training
Uses Whole Word Masking during training to improve Persian entity recognition performance
Dual Dataset Support
Supports both PEYMA and ARMAN major Persian NER benchmark datasets
SOTA Performance
Achieves an F1 score of 98.79 on the PEYMA dataset, significantly outperforming other Persian NER models

Model Capabilities

Persian text entity recognition
Organization name detection
Geographical name recognition
Person name extraction
Time/date recognition
Currency/percentage detection
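Token-classification models like this one typically emit one BIO tag per token (e.g. B-ORG for the start of an organization, I-ORG for its continuation, O for non-entities), which downstream code must merge into entity spans. The sketch below shows that grouping step; the specific tag names (B-LOC, B-ORG, etc.) are illustrative assumptions, not the model's exact label set.

```python
# Sketch: merging token-level BIO tags, as an NER model such as ParsBERT
# would produce, into (entity_text, entity_type) spans.
# Tag names here (B-LOC, I-LOC, ...) are assumed for illustration.

def group_bio_entities(tokens, tags):
    """Merge parallel lists of tokens and BIO tags into entity spans."""
    entities = []
    current_tokens, current_type = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):
            # A B- tag always starts a new entity, closing any open one.
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [token], tag[2:]
        elif tag.startswith("I-") and current_type == tag[2:]:
            # Continuation of the current entity of the same type.
            current_tokens.append(token)
        else:
            # An O tag (or a mismatched I- tag) closes the open entity.
            if current_tokens:
                entities.append((" ".join(current_tokens), current_type))
            current_tokens, current_type = [], None
    if current_tokens:
        entities.append((" ".join(current_tokens), current_type))
    return entities

tokens = ["Tehran", "is", "the", "capital", "of", "Iran"]
tags = ["B-LOC", "O", "O", "O", "O", "B-LOC"]
print(group_bio_entities(tokens, tags))
# [('Tehran', 'LOC'), ('Iran', 'LOC')]
```

In practice the Hugging Face `pipeline("token-classification", ...)` API can perform this aggregation for you; the manual version above makes the BIO convention explicit.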

Use Cases

Information Extraction
News Text Analysis
Automatically extracts key entities like person names and organizations from Persian news
Achieves 93.10 F1 score on ARMAN dataset
Business Intelligence
Financial Document Processing
Identifies currency amounts and percentage data in Persian financial reports
Over 90% accuracy in currency recognition on PEYMA dataset