
Bert Base Parsbert Peymaner Uncased

Developed by HooshvareLab
ParsBERT is a Persian language understanding model based on the BERT architecture; this checkpoint is specifically optimized for Persian Named Entity Recognition (NER).
Downloads: 40
Release Time: 3/2/2022

Model Overview

This model is optimized for Persian Named Entity Recognition, identifying entities such as person names, locations, and organizations in text using the IOB annotation format.
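
As a rough illustration of how such a checkpoint is typically used, the sketch below runs the model through the Hugging Face Transformers token-classification pipeline. The repository id is assumed from the model title, and the sample sentence is made up for demonstration.

```python
# Minimal sketch of Persian NER with this checkpoint via Hugging Face Transformers.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "HooshvareLab/bert-base-parsbert-peymaner-uncased"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# aggregation_strategy="simple" groups word pieces back into whole entities.
ner = pipeline("token-classification", model=model, tokenizer=tokenizer,
               aggregation_strategy="simple")

text = "سازمان ملل متحد در نیویورک واقع شده است"  # "The United Nations is located in New York."
for entity in ner(text):
    print(entity["entity_group"], entity["word"], round(entity["score"], 3))
```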

Model Features

Whole Word Masking Training
Utilizes Whole Word Masking (WWM) during pre-training, enhancing the model's understanding of the Persian language.
Multi-dataset Support
Supports the two major Persian NER datasets, ARMAN and PEYMA, as well as their combination.
IOB Annotation Format
Uses the standard IOB format for entity annotation, facilitating integration with other systems (illustrated in the sketch below).
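
To make the IOB scheme concrete, the snippet below shows a hypothetical tagging of a short Persian sentence and how labeled spans are reassembled from it; the tokens and labels are illustrative, and the exact label set depends on the training data.

```python
# Illustrative IOB (Inside-Outside-Beginning) tagging: one label per token.
# B- marks the first token of an entity, I- continues it, O means outside any entity.
tokens = ["حسن", "روحانی", "در", "تهران", "سخنرانی", "کرد"]  # "Hassan Rouhani gave a speech in Tehran"
labels = ["B-PER", "I-PER", "O", "B-LOC", "O", "O"]

# Reassemble labeled entity spans from the IOB sequence.
entities, current = [], None
for token, label in zip(tokens, labels):
    if label.startswith("B-"):
        current = [label[2:], [token]]
        entities.append(current)
    elif label.startswith("I-") and current and current[0] == label[2:]:
        current[1].append(token)
    else:
        current = None

for entity_type, words in entities:
    print(entity_type, " ".join(words))
# PER حسن روحانی
# LOC تهران
```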

Model Capabilities

Persian Text Understanding
Named Entity Recognition
Entity Type Classification and Labeling

Use Cases

Information Extraction
News Entity Extraction
Extract key information such as person names, organization names, and locations from Persian news texts.
Accurately identifies various named entities in Persian texts.
Social Media Analysis
Analyze entities mentioned in Persian social media content.
Helps identify the people and organizations associated with a topic (a brief sketch follows below).
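
As a hedged sketch of this use case, the snippet below tallies which entities appear across a small batch of Persian posts; the posts, the repository id, and the label names (PER, ORG, LOC) are assumptions for illustration.

```python
# Hypothetical sketch: count people, organizations, and locations mentioned
# across a batch of Persian social media posts.
from collections import Counter
from transformers import pipeline

ner = pipeline(
    "token-classification",
    model="HooshvareLab/bert-base-parsbert-peymaner-uncased",  # assumed repository id
    aggregation_strategy="simple",
)

posts = [
    "امروز با دوستانم به موزه ملی ایران رفتیم",
    "بازی پرسپولیس در تهران برگزار شد",
]  # illustrative posts, not real data

counts = Counter()
for post in posts:
    for ent in ner(post):
        if ent["entity_group"] in {"PER", "ORG", "LOC"}:
            counts[(ent["entity_group"], ent["word"])] += 1

for (entity_type, name), n in counts.most_common():
    print(entity_type, name, n)
```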