
Bert Fa Base Uncased Ner Arman

Developed by HooshvareLab
A Transformer-based Persian language understanding model with a reconstructed vocabulary, fine-tuned on new corpora to broaden its multi-domain applicability
Downloads 110
Release Time: 3/2/2022

Model Overview

ParsBERT is a BERT model optimized for Persian, primarily used for natural language understanding tasks such as Named Entity Recognition (NER). Version 2.0 improves model performance and application scope by reconstructing the vocabulary and fine-tuning on new corpora.
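For reference, a minimal sketch of loading the model for token classification with the Hugging Face transformers library. The Hub ID HooshvareLab/bert-fa-base-uncased-ner-arman is assumed from the model title above.

```python
# Minimal sketch: load the model and tokenizer and build a NER pipeline.
# The model ID below is assumed from the title of this page.
from transformers import AutoTokenizer, AutoModelForTokenClassification, pipeline

model_id = "HooshvareLab/bert-fa-base-uncased-ner-arman"  # assumed Hub ID

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForTokenClassification.from_pretrained(model_id)

# The "token-classification" pipeline merges word pieces back into entity
# spans when an aggregation strategy is set.
ner = pipeline(
    "token-classification",
    model=model,
    tokenizer=tokenizer,
    aggregation_strategy="simple",
)
```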

Model Features

Optimized Persian Processing
Vocabulary reconstructed specifically for Persian to enhance language understanding
Multi-domain Applicability
Fine-tuned on new corpora, expanding the model's application across different domains
High-performance NER
Achieves an F1 score of 99.84 in Persian Named Entity Recognition tasks

Model Capabilities

Persian text understanding
Named Entity Recognition
Token classification

Use Cases

Information Extraction
Persian Text Entity Recognition
Identify entities such as organizations, locations, and people from Persian text (see the sketch after this list)
Achieved an F1 score of 99.84 on the ARMAN dataset
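As a sketch of how this use case might look in practice: the snippet below runs the pipeline end to end and prints the grouped predictions. The example sentence is illustrative, not taken from the ARMAN dataset, and the exact entity label names depend on the model's tag set.

```python
from transformers import pipeline

# Sketch: end-to-end entity extraction; the model ID is assumed from the title.
ner = pipeline(
    "token-classification",
    model="HooshvareLab/bert-fa-base-uncased-ner-arman",
    aggregation_strategy="simple",
)

# Illustrative Persian sentence ("The United Nations is located in New York").
text = "سازمان ملل متحد در نیویورک واقع شده است"

for entity in ner(text):
    # Each aggregated prediction carries an entity label (e.g. an organization
    # or location tag), a confidence score, the matched text span, and
    # character offsets.
    print(entity["entity_group"], entity["word"], round(float(entity["score"]), 3))
```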