
GPT2 Persian

Developed by bolbolzaban
A Persian language model based on the GPT2 architecture, designed for Persian text generation with enhanced handling of classical poetry.
Downloads: 691
Release Time: 3/2/2022

Model Overview

This is a GPT2 model optimized for Persian. It uses the Google SentencePiece tokenizer and is intended for Persian text generation and poetry research.
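
The model can be loaded with the Hugging Face transformers library. The sketch below is a minimal example assuming the checkpoint is published under the bolbolzaban/gpt2-persian identifier (inferred from the developer name above) and that its tokenizer files ship with the repository.

```python
# Minimal sketch: loading the model and generating Persian text with
# Hugging Face transformers. The model id "bolbolzaban/gpt2-persian"
# is an assumption based on the developer name above.
from transformers import AutoTokenizer, GPT2LMHeadModel, pipeline

MODEL_ID = "bolbolzaban/gpt2-persian"  # assumed Hugging Face model id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
model = GPT2LMHeadModel.from_pretrained(MODEL_ID)

generator = pipeline("text-generation", model=model, tokenizer=tokenizer)

# Generate a short continuation of a Persian prompt
# ("In a surprising turn of events, researchers ...").
result = generator(
    "در یک اتفاق شگفت انگیز، پژوهشگران",
    max_length=128,          # stay well under the 256-token context limit
    do_sample=True,
    top_k=50,
    num_return_sequences=1,
)
print(result[0]["generated_text"])
```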

Model Features

Persian optimization
Trained specifically on Persian text; all non-Persian characters are replaced with special tokens
Enhanced poetry processing
Supports special token formats for classical Persian poetry, such as [BOM] and [EOS] (see the prompt sketch under Use Cases below)
Efficient tokenization
Uses the Google SentencePiece tokenizer instead of the standard BPE tokenizer (see the tokenizer sketch after this list)
Computational optimization
Context length reduced from 1024 to 256 tokens to lower training costs
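
As a rough illustration of the tokenization and context-length points above, the sketch below inspects how the SentencePiece tokenizer splits a Persian sentence and truncates inputs to the 256-token window. The model id and the presence of tokenizer files in the repository are assumptions.

```python
# Sketch: inspecting SentencePiece tokenization and respecting the
# 256-token context window. The model id is assumed, as above.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("bolbolzaban/gpt2-persian")

text = "هوش مصنوعی زندگی ما را تغییر می دهد"  # "AI is changing our lives"

# SentencePiece splits the input into subword pieces rather than BPE merges.
pieces = tokenizer.tokenize(text)
print(pieces)

# Encode with explicit truncation so prompts never exceed the model's
# reduced 256-token context length.
encoded = tokenizer(text, truncation=True, max_length=256, return_tensors="pt")
print(encoded["input_ids"].shape)
```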

Model Capabilities

Persian text generation
Classical poetry continuation
Persian language understanding

Use Cases

Literary creation
Persian poetry generation
Continues a complete poem from input Persian verses (see the prompt sketch after this list)
Can generate text that conforms to classical Persian poetic meter
Language research
Persian language model research
Used to study the characteristics of Persian language models
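
The poetry use case can be prompted with the special tokens listed under Model Features. The exact prompt layout below, with [BOM] marking a hemistich boundary and [EOS] ending a sample, is an assumption for illustration rather than a documented format, and the model id is assumed as above.

```python
# Sketch: continuing a classical Persian verse. Treating [BOM] as a
# hemistich separator and [EOS] as an end marker follows the special
# tokens listed under Model Features; the exact prompt layout here is
# an assumption for illustration.
from transformers import pipeline

generator = pipeline("text-generation", model="bolbolzaban/gpt2-persian")

# First hemistich of a well-known verse by Saadi, followed by [BOM]
# to ask the model for the next hemistich.
prompt = "بنی آدم اعضای یکدیگرند [BOM]"

outputs = generator(
    prompt,
    max_length=96,
    do_sample=True,
    top_p=0.9,
    num_return_sequences=2,
)

for out in outputs:
    # Trim anything generated after an [EOS] marker, if the model emits one.
    text = out["generated_text"].split("[EOS]")[0]
    print(text)
```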