ALBERT-Persian
A Lite BERT for Self-supervised Learning of Language Representations for the Persian Language
ALBERT-Persian is the first attempt at ALBERT for the Persian language. It can also be called "BERT-Kucholo" (Persian for "little BERT"). ALBERT-Persian was trained based on Google's ALBERT BASE Version 2.0. The training data consists of over 3.9M documents, 73M sentences, and 1.3B words from various writing styles and numerous subjects (e.g., scientific texts, novels, news), similar to the approach used for ParsBERT.
Please follow the ALBERT-Persian repo for the latest information about previous and current models.
Quick Start
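The model can be used with the Hugging Face `transformers` library. The snippet below is a minimal sketch for extracting contextual embeddings from a Persian sentence; the checkpoint ID `m3hrdadfi/albert-fa-base-v2` is an assumption here, so check the ALBERT-Persian repo for the exact released model IDs.

```python
# Minimal sketch: load ALBERT-Persian and extract contextual embeddings.
# "m3hrdadfi/albert-fa-base-v2" is an assumed Hub checkpoint ID.
# Requires the `transformers`, `torch`, and `sentencepiece` packages.
from transformers import AutoModel, AutoTokenizer

model_id = "m3hrdadfi/albert-fa-base-v2"  # assumed Hub ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

text = "سلام دنیا"  # "Hello, world" in Persian
inputs = tokenizer(text, return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, sequence_length, hidden_size)
```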
Features
- It's the first ALBERT-based model for the Persian language.
- Trained on a large and diverse dataset of Persian texts.
Documentation
Persian Text Classification [DigiMag, Persian News]
The task aims to label texts in a supervised manner over both existing datasets, DigiMag and Persian News. A usage sketch follows below.
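For illustration, a fine-tuned classifier can be run through the `transformers` pipeline API. The sketch below assumes a DigiMag fine-tuned checkpoint published on the Hub under an ID such as `m3hrdadfi/albert-fa-base-v2-clf-digimag`; that ID is an assumption, so consult the repo for the released models.

```python
# Sketch: classify a Persian text with a fine-tuned ALBERT-Persian model.
# "m3hrdadfi/albert-fa-base-v2-clf-digimag" is an assumed checkpoint ID.
from transformers import pipeline

classifier = pipeline(
    "text-classification",
    model="m3hrdadfi/albert-fa-base-v2-clf-digimag",
)

result = classifier("بررسی جدیدترین گوشی‌های هوشمند بازار")  # a shopping-guide style text
print(result)  # e.g. [{"label": "Shopping Guide", "score": ...}]
```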
DigiMag
A total of 8,515 articles were scraped from Digikala Online Magazine. This dataset includes seven different classes:
- Video Games
- Shopping Guide
- Health Beauty
- Science Technology
- General
- Art Cinema
- Books Literature
| Label              | #    |
|:------------------:|:----:|
| Video Games        | 1967 |
| Shopping Guide     | 125  |
| Health Beauty      | 1610 |
| Science Technology | 2772 |
| General            | 120  |
| Art Cinema         | 1667 |
| Books Literature   | 254  |
Download
You can download the dataset from here
Results
The following table summarizes the F1 score obtained by ALBERT-fa-base-v2 compared to other models and architectures.
| Dataset           | ALBERT-fa-base-v2 | ParsBERT-v1 | mBERT |
|:-----------------:|:-----------------:|:-----------:|:-----:|
| Digikala Magazine | 92.33             | 93.59       | 90.72 |
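For reference, scores like these are computed from gold labels and model predictions. The sketch below shows one way to do it with scikit-learn; the macro averaging is an assumption here, not the papers' stated setup.

```python
# Illustrative only: computing an F1 score from labels and predictions.
# Macro averaging is assumed; see the papers for the exact evaluation setup.
from sklearn.metrics import f1_score

y_true = ["Video Games", "General", "Health Beauty", "Video Games"]
y_pred = ["Video Games", "General", "Science Technology", "Video Games"]
print(f1_score(y_true, y_pred, average="macro") * 100)
```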
License
This project is licensed under the Apache-2.0 license.
BibTeX entry and citation info
Please cite the following in publications:
@misc{ALBERTPersian,
  author = {Mehrdad Farahani},
  title = {ALBERT-Persian: A Lite BERT for Self-supervised Learning of Language Representations for the Persian Language},
  year = {2020},
  publisher = {GitHub},
  journal = {GitHub repository},
  howpublished = {\url{https://github.com/m3hrdadfi/albert-persian}},
}
@article{ParsBERT,
  title = {ParsBERT: Transformer-based Model for Persian Language Understanding},
  author = {Mehrdad Farahani and Mohammad Gharachorloo and Marzieh Farahani and Mohammad Manthouri},
  journal = {ArXiv},
  year = {2020},
  volume = {abs/2005.12515}
}
Questions?
Post a GitHub issue on the ALBERT-Persian repo.