# Autoregressive language model
## GPT2 Zinc 87M

**entropy** · MIT · Molecular Model · Transformers · 404 downloads · 3 likes

An autoregressive language model based on the GPT-2 architecture, designed to generate drug-like molecules or to produce embedding representations from SMILES strings.
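
A minimal sampling sketch with the `transformers` library. The repo ID `entropy/gpt2_zinc_87m` is an assumption inferred from this listing, as is the availability of a BOS token in the tokenizer:

```python
# Minimal sketch, not an official example. Repo ID inferred from the listing.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "entropy/gpt2_zinc_87m"  # assumed Hub repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Sample de novo molecules; starting from the BOS token assumes the
# tokenizer defines one (common for GPT-2-style vocabularies).
outputs = model.generate(
    bos_token_id=tokenizer.bos_token_id,
    do_sample=True,
    top_p=0.95,
    max_length=64,
    num_return_sequences=3,
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))  # a SMILES string
```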
## Polyglot-Ko 1.3B

**EleutherAI** · Apache-2.0 · Large Language Model · Transformers, Korean · 121.13k downloads · 83 likes

Polyglot-Ko is a series of Korean autoregressive language models developed by EleutherAI's multilingual team; this checkpoint contains 1.3 billion parameters and is optimized specifically for Korean.
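
A minimal text-generation sketch, assuming the Hub repo ID `EleutherAI/polyglot-ko-1.3b` inferred from this listing:

```python
# Minimal sketch; repo ID inferred from the listing.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/polyglot-ko-1.3b")
# Korean prompt: "The future of artificial intelligence is"
result = generator("인공지능의 미래는", max_new_tokens=50, do_sample=True)
print(result[0]["generated_text"])
```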
## CodeGen 16B Mono

**Salesforce** · BSD-3-Clause · Large Language Model · Transformers · 227 downloads · 126 likes

CodeGen-Mono 16B is an autoregressive language model for program synthesis, specializing in generating executable code from English prompts.
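
A minimal program-synthesis sketch, assuming the Hub repo ID `Salesforce/codegen-16B-mono` inferred from this listing; the 16B weights need tens of GB of memory, but the smaller CodeGen variants expose the same interface:

```python
# Minimal sketch; repo ID inferred from the listing.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "Salesforce/codegen-16B-mono"  # assumed Hub repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# An English prompt describing the desired function, plus its signature.
prompt = "# Return the n-th Fibonacci number\ndef fib(n):"
inputs = tokenizer(prompt, return_tensors="pt")
completion = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(completion[0], skip_special_tokens=True))
```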
## GPT-2 Finetuned Greek

**lighteternal** · Apache-2.0 · Large Language Model · Other · 178 downloads · 7 likes

A Greek text-generation model fine-tuned from the English GPT-2 model, jointly developed by the Hellenic Military Academy and the Technical University of Crete.
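
A minimal generation sketch, assuming the Hub repo ID `lighteternal/gpt2-finetuned-greek` inferred from this listing:

```python
# Minimal sketch; repo ID inferred from the listing.
from transformers import pipeline

generator = pipeline("text-generation", model="lighteternal/gpt2-finetuned-greek")
# Greek prompt: "Artificial intelligence"
result = generator("Η τεχνητή νοημοσύνη", max_new_tokens=40)
print(result[0]["generated_text"])
```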
## Ko-GPT-Trinity 1.2B v0.5

**skt** · Large Language Model · Transformers, Korean · 1,294 downloads · 44 likes

A 1.2-billion-parameter Korean Transformer model based on the GPT-3 architecture, developed by SK Telecom, used primarily for Korean text generation and comprehension tasks.
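
A minimal generation sketch, assuming the Hub repo ID `skt/ko-gpt-trinity-1.2B-v0.5` inferred from this listing:

```python
# Minimal sketch; repo ID inferred from the listing.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "skt/ko-gpt-trinity-1.2B-v0.5"  # assumed Hub repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# Korean prompt: "Seoul is"
inputs = tokenizer("서울은", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=50, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```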
## XGLM 7.5B

**facebook** · MIT · Large Language Model · Transformers, Supports Multiple Languages · 1,260 downloads · 57 likes

XGLM-7.5B is a multilingual autoregressive language model with 7.5 billion parameters, supporting 30+ languages and trained on a diverse corpus of 500 billion subword tokens.
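
A minimal multilingual sketch, assuming the Hub repo ID `facebook/xglm-7.5B` inferred from this listing; the same checkpoint completes prompts across its supported languages:

```python
# Minimal sketch; repo ID inferred from the listing.
from transformers import AutoTokenizer, AutoModelForCausalLM

model_id = "facebook/xglm-7.5B"  # assumed Hub repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

# The same model handles prompts in different languages.
for prompt in ["The capital of France is", "La capitale de la France est"]:
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=10)
    print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```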