# GPT-2 architecture

## Zenz V2.5 Small
- **Author:** Miwa-Keita
- **Tags:** Large Language Model, Japanese
- **Description:** A conditional language model based on the GPT-2 architecture, designed for Japanese kana-kanji conversion and intended for use with the Zenzai neural kana-kanji conversion system.
## Turkish GPT-2 Medium
- **License:** MIT
- **Author:** ytu-ce-cosmos
- **Tags:** Large Language Model, Transformers
- **Description:** A medium-sized text-generation model based on GPT-2, built specifically for Turkish and capable of coherently continuing a given text fragment.
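The models in this listing are GPT-2 variants distributed through the Hugging Face hub, so a plain continuation model such as Turkish GPT-2 Medium can typically be driven with the transformers text-generation pipeline. The sketch below is illustrative only: the repo id `ytu-ce-cosmos/turkish-gpt2-medium`, the prompt, and the decoding settings are assumptions to be checked against the model card.

```python
# Minimal sketch: continuing a Turkish text fragment with a GPT-2-based model.
# The repo id below is assumed from the listing; verify it on the model card.
from transformers import pipeline

generator = pipeline("text-generation", model="ytu-ce-cosmos/turkish-gpt2-medium")

prompt = "Yapay zeka modelleri"  # an arbitrary Turkish fragment to be continued
outputs = generator(prompt, max_new_tokens=40, do_sample=True, top_p=0.95)
print(outputs[0]["generated_text"])
```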
## ArabianGPT 0.3B
- **License:** Apache-2.0
- **Author:** riotu-lab
- **Tags:** Large Language Model, Transformers, Arabic
- **Description:** ArabianGPT-0.3B is a GPT-2-based model developed by the Robotics and Internet of Things Laboratory at Prince Sultan University and optimized for the complex characteristics of the Arabic language.
## LaMini-GPT-1.5B
- **Author:** MBZUAI
- **Tags:** Large Language Model, Transformers, English
- **Description:** LaMini-GPT-1.5B is a large language model from the LaMini-LM series, fine-tuned from GPT-2-XL with a focus on instruction-following tasks.
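Because LaMini-GPT-1.5B is instruction-tuned, it is usually prompted with an instruction wrapper rather than a bare text fragment. The sketch below assumes the repo id `MBZUAI/LaMini-GPT-1.5B` and an Alpaca-style wrapper; the exact template should be taken from the model card.

```python
# Minimal sketch: prompting an instruction-following GPT-2 variant.
# Repo id and prompt template are assumptions; consult the LaMini-GPT-1.5B model card.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "MBZUAI/LaMini-GPT-1.5B"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

instruction = "Explain in one sentence what a language model does."
prompt = (
    "Below is an instruction that describes a task. "
    "Write a response that appropriately completes the request.\n\n"
    f"### Instruction:\n{instruction}\n\n### Response:"
)

inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```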
## CodeParrot
- **Author:** codeparrot
- **Tags:** Large Language Model, Transformers, Other
- **Description:** CodeParrot is a 1.5-billion-parameter model based on the GPT-2 architecture, focused on automatic Python code generation.
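A code model like CodeParrot is used the same way as the text models above, except that the prompt is Python source and greedy decoding is often preferred. The repo id `codeparrot/codeparrot` in the sketch below is assumed from the listing; a smaller variant may be more practical for quick experiments.

```python
# Minimal sketch: Python code completion with a GPT-2-based code model.
# The repo id is assumed from the listing above; verify it before use.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "codeparrot/codeparrot"
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = 'def fibonacci(n):\n    """Return the n-th Fibonacci number."""\n'
inputs = tokenizer(prompt, return_tensors="pt")
output_ids = model.generate(**inputs, max_new_tokens=48, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```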
## BelGPT-2
- **License:** MIT
- **Author:** antoinelouis
- **Tags:** Large Language Model, French
- **Description:** BelGPT-2 is a GPT-2 model pre-trained on a large French corpus (approximately 60 GB), specialized in French text generation.
## GPT-2 Small Portuguese Finetuned Peticoes
- **License:** MIT
- **Author:** Luciano
- **Tags:** Large Language Model, Transformers, Other
- **Description:** A fine-tuned version of the small Portuguese GPT-2 model, specifically adapted to petition (peticoes) texts.