
bert-fa-zwnj-base

Developed by HooshvareLab
A Persian language understanding model based on the Transformer architecture that correctly handles the zero-width non-joiner (ZWNJ) in Persian writing
Downloads 5,590
Release date: 3/2/2022

Model Overview

ParsBERT is a monolingual Persian language model based on the BERT architecture, pretrained on a large-scale, multi-genre Persian corpus. It is well suited to Persian natural language processing tasks.

Model Features

Zero-Width Non-Joiner Handling
Correctly handles the zero-width non-joiner (ZWNJ) used in Persian writing
Multi-Genre Corpus Training
Trained on a Persian corpus spanning genres such as scientific literature, novels, and news
Optimized Vocabulary
Uses a newly designed vocabulary to improve Persian language processing
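To illustrate why ZWNJ handling matters, the plain-Python sketch below (not taken from the model card; the example words are our own) shows that the ZWNJ is a real Unicode codepoint (U+200C) separating Persian morphemes without a visible space. A tokenizer that drops or mishandles it can silently merge word parts:

```python
# The zero-width non-joiner (ZWNJ, U+200C) separates Persian morphemes
# without inserting a visible space, e.g. the prefix "می" from the verb
# stem "رود" in "می‌رود" ("goes"). A model whose vocabulary ignores it
# may conflate the joined and non-joined spellings.
ZWNJ = "\u200c"

word_with_zwnj = "می" + ZWNJ + "رود"   # renders as one word: می‌رود
word_without = "می" + "رود"            # fully joined form: میرود

# The two strings look nearly identical but differ at the codepoint level.
print(ZWNJ in word_with_zwnj)                    # True
print(ZWNJ in word_without)                      # False
print(len(word_with_zwnj) - len(word_without))   # 1 (the ZWNJ codepoint)
```

Stripping the ZWNJ (`word_with_zwnj.replace(ZWNJ, "")`) collapses the two forms into one string, which is exactly the ambiguity this model's preprocessing is designed to avoid.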

Model Capabilities

Persian text understanding
Persian text classification
Persian question answering systems
Persian named entity recognition

Use Cases

Natural Language Processing
Persian Text Classification
Classifying Persian news articles, user comments, and other content
Persian Question Answering System
Building intelligent Persian question answering applications