# Autoregressive model
## Perceiver AR Sam Giant MIDI
- **Author:** krasserm
- **License:** Apache-2.0
- **Tags:** Audio Generation, Transformers
- **Downloads:** 153 · **Likes:** 11

A symbolic audio model based on the Perceiver AR architecture, pre-trained on the GiantMIDI-Piano dataset for symbolic audio generation.
## CodeGen 350M HTML
- **Author:** alecsharpie
- **License:** BSD-3-Clause
- **Tags:** Large Language Model, Transformers, Other
- **Downloads:** 132 · **Likes:** 14

CodeGen-HTML 350M is an autoregressive language model fine-tuned from CodeGen-Multi 350M, specifically designed for HTML code generation.
## GPT-J 6B
- **Author:** EleutherAI
- **License:** Apache-2.0
- **Tags:** Large Language Model, English
- **Downloads:** 297.31k · **Likes:** 1,493

GPT-J 6B is a 6-billion-parameter autoregressive language model trained using the Mesh Transformer JAX framework, employing the same tokenizer as GPT-2/3.
## GPT-Neo 125M
- **Author:** EleutherAI
- **License:** MIT
- **Tags:** Large Language Model, English
- **Downloads:** 150.96k · **Likes:** 204

GPT-Neo 125M is a Transformer model with 125 million parameters, developed by EleutherAI and based on the GPT-3 architecture, primarily used for English text generation tasks.
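The autoregressive models listed above are distributed on the Hugging Face Hub and can be loaded with the Transformers library. A minimal sketch using GPT-Neo 125M, assuming the Hub model ID `EleutherAI/gpt-neo-125m` corresponds to the entry above and that `transformers` and a backend such as `torch` are installed:

```python
# Minimal greedy text-generation sketch with GPT-Neo 125M via the
# Hugging Face Transformers pipeline API. The model ID below is an
# assumption matching the GPT-Neo 125M listing; the model weights are
# downloaded from the Hub on first use.
from transformers import pipeline

generator = pipeline("text-generation", model="EleutherAI/gpt-neo-125m")

# Greedy decoding (do_sample=False) appends up to 20 new tokens
# to the prompt, one token at a time (autoregressively).
out = generator(
    "Autoregressive models predict",
    max_new_tokens=20,
    do_sample=False,
)
print(out[0]["generated_text"])
```

The same `pipeline("text-generation", model=...)` call works for the other causal language models in this list (e.g. GPT-J 6B), subject to their larger memory requirements.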