
FaBERT

Developed by sbunlp
A BERT-base model pretrained on Persian blog text, performing strongly on multiple Persian NLP tasks
Downloads: 627
Release date: 2/9/2024

Model Overview

FaBERT is a Persian BERT-base model pretrained on the diverse HmBlogs corpus, which covers both spoken (colloquial) and written Persian, making it suitable for a wide range of natural language processing tasks.
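The model can be used directly for masked-word prediction with the Hugging Face Transformers library. A minimal usage sketch, assuming the checkpoint is published on the Hub under the repository id "sbunlp/fabert" (the id is not stated in this card):

```python
# Minimal fill-mask sketch; "sbunlp/fabert" is an assumed Hub repository
# id for this model, not confirmed by this card.
from transformers import pipeline

fill = pipeline("fill-mask", model="sbunlp/fabert")

# "The capital of Iran is [MASK]." in Persian.
for pred in fill("پایتخت ایران [MASK] است."):
    print(pred["token_str"], round(pred["score"], 3))
```

The pipeline returns the top candidate tokens for the `[MASK]` position together with their probabilities.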

Model Features

Diverse Corpus Training
Trained on more than 50 GB of Persian blog text (the HmBlogs corpus), covering both spoken and written language
Compact and Efficient
At 124 million parameters, FaBERT delivers strong performance while remaining compact
Multitask Adaptability
Validated on multiple natural language understanding tasks and easily fine-tuned for downstream applications
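The 124-million-parameter figure is consistent with a standard BERT-base architecture (12 layers, hidden size 768, feed-forward size 3072, 512 positions) combined with a roughly 50k-token vocabulary. A back-of-the-envelope sketch, where the 50,000-token vocabulary size is an assumption not stated in this card:

```python
# Rough parameter count for a BERT-base model; the 50k vocabulary
# size is an assumption, the rest are standard BERT-base settings.
HIDDEN = 768
LAYERS = 12
FFN = 3072
VOCAB = 50_000      # assumed vocabulary size
MAX_POS = 512
TYPES = 2           # token-type (segment) embeddings

def linear(n_in, n_out):
    """Weights plus bias of a dense layer."""
    return n_in * n_out + n_out

# Embedding tables plus their layer norm (weight and bias).
embeddings = (VOCAB + MAX_POS + TYPES) * HIDDEN + 2 * HIDDEN

# One transformer layer: Q, K, V, output projection, two layer norms,
# and the two feed-forward projections.
attention = 4 * linear(HIDDEN, HIDDEN) + 2 * HIDDEN
ffn = linear(HIDDEN, FFN) + linear(FFN, HIDDEN) + 2 * HIDDEN
per_layer = attention + ffn

pooler = linear(HIDDEN, HIDDEN)

total = embeddings + LAYERS * per_layer + pooler
print(f"{total:,}")  # → 124,441,344
```

Under these assumptions the count lands at roughly 124 million, matching the figure above.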

Model Capabilities

Text understanding
Sentiment analysis
Named entity recognition
Question answering systems
Natural language inference

Use Cases

Sentiment analysis
Opinion mining
Analyzing sentiment polarity in Persian texts
Achieved 87.51% accuracy on the MirasOpinion dataset
Named entity recognition
Entity recognition
Identifying entities such as person names and locations in Persian texts
F1 score of 91.39 on the PEYMA dataset
Question answering systems
Persian QA
Answering questions based on Persian passages
Exact match (EM) score of 55.87 on the ParsiNLU dataset
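The exact match and F1 numbers above follow standard QA evaluation conventions. A minimal sketch of how such scores are typically computed per example (these helpers are illustrative, not the actual FaBERT evaluation code):

```python
from collections import Counter

def exact_match(prediction: str, reference: str) -> bool:
    """Strict string equality after whitespace normalization."""
    return " ".join(prediction.split()) == " ".join(reference.split())

def token_f1(prediction: str, reference: str) -> float:
    """Token-level F1: harmonic mean of precision and recall over
    the tokens shared by prediction and reference."""
    pred, ref = prediction.split(), reference.split()
    overlap = sum((Counter(pred) & Counter(ref)).values())
    if overlap == 0:
        return 0.0
    precision = overlap / len(pred)
    recall = overlap / len(ref)
    return 2 * precision * recall / (precision + recall)

print(exact_match("تهران", "تهران"))                      # → True
print(round(token_f1("in tehran city", "tehran city"), 2))  # → 0.8
```

Dataset-level scores are then the average of these per-example values over the evaluation set.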