# Autoregressive Models

## XGLM-564M
**License:** MIT · **Publisher:** facebook · **Tags:** Large Language Model, Supports Multiple Languages · **Downloads:** 11.13k · **Likes:** 51

XGLM-564M is a multilingual autoregressive language model with 564 million parameters, trained on a balanced corpus of 30 languages totaling 500 billion subword tokens.
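A minimal generation sketch, assuming this entry corresponds to the `facebook/xglm-564M` checkpoint on Hugging Face and that `transformers` and `torch` are installed:

```python
# Minimal sketch: load XGLM-564M and sample a continuation.
from transformers import AutoModelForCausalLM, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("facebook/xglm-564M")
model = AutoModelForCausalLM.from_pretrained("facebook/xglm-564M")

# XGLM is multilingual, so prompts in any of its 30 training languages work.
inputs = tokenizer("La vie est belle parce que", return_tensors="pt")
outputs = model.generate(**inputs, max_new_tokens=30, do_sample=True, top_p=0.9)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```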
## Liquid V1 7B
**License:** MIT · **Publisher:** Junfeng5 · **Tags:** Text-to-Image, Transformers, English · **Downloads:** 11.35k · **Likes:** 84

Liquid is an autoregressive generation paradigm that fuses visual understanding and generation by tokenizing images into discrete codes and learning those code embeddings alongside text tokens in a shared feature space.
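A conceptual sketch of the shared token space described above; this is illustrative only, not Liquid's actual code, and all sizes and names are hypothetical. Text token ids and discrete VQ image codes are mapped into one unified vocabulary so a single embedding table serves both modalities:

```python
# Conceptual sketch (not Liquid's actual implementation): one embedding
# table shared by text tokens and discrete image codes, so a single
# autoregressive transformer can model both. All sizes are illustrative.
import torch
import torch.nn as nn

TEXT_VOCAB = 32_000   # hypothetical text tokenizer size
IMAGE_CODES = 8_192   # hypothetical VQ codebook size
D_MODEL = 512

# Unified vocabulary: ids [0, TEXT_VOCAB) are text tokens,
# ids [TEXT_VOCAB, TEXT_VOCAB + IMAGE_CODES) are image codes.
embed = nn.Embedding(TEXT_VOCAB + IMAGE_CODES, D_MODEL)

def to_unified_ids(text_ids: torch.Tensor, image_codes: torch.Tensor) -> torch.Tensor:
    # Shift image codes into the tail of the vocabulary and append them
    # after the text prompt, forming one mixed-modality sequence.
    return torch.cat([text_ids, image_codes + TEXT_VOCAB], dim=-1)

seq = to_unified_ids(torch.tensor([[5, 17, 42]]), torch.tensor([[901, 3302]]))
hidden = embed(seq)   # both modalities now live in the same feature space
print(hidden.shape)   # torch.Size([1, 5, 512])
```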
## Lumina-mGPT 7B-512
**Publisher:** Alpha-VLLM · **Tags:** Text-to-Image · **Downloads:** 1,185 · **Likes:** 4

Lumina-mGPT is a family of multimodal autoregressive models that excel at a variety of vision-language tasks, particularly generating flexible and realistic images from text descriptions.
## Lumina-mGPT 7B-1024
**Publisher:** Alpha-VLLM · **Tags:** Text-to-Image · **Downloads:** 27 · **Likes:** 9

Lumina-mGPT is a family of multimodal autoregressive models that excel at generating flexible and realistic images from text descriptions and can perform a variety of other vision-language tasks.
## Lumina-mGPT 7B-768
**Publisher:** Alpha-VLLM · **Tags:** Text-to-Image, Transformers · **Downloads:** 1,944 · **Likes:** 33

Lumina-mGPT is a family of multimodal autoregressive models that excel at generating flexible and realistic images from text descriptions, alongside a variety of other vision-language tasks.
## Lumina-mGPT 7B-768-Omni
**Publisher:** Alpha-VLLM · **Tags:** Text-to-Image, Transformers · **Downloads:** 264 · **Likes:** 7

Lumina-mGPT is a series of multimodal autoregressive models that excel at generating flexible and realistic images from text descriptions.
## CodeLlama-7b-Instruct-hf
**Publisher:** meta-llama · **Tags:** Large Language Model, Transformers, Other · **Downloads:** 28.32k · **Likes:** 48

Code Llama is Meta's family of code generation and comprehension models, released as pre-trained and fine-tuned variants ranging from 7B to 34B parameters. This is the 7B instruction-tuned version, optimized for code-assistant scenarios.
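A hedged usage sketch, assuming this entry corresponds to the `codellama/CodeLlama-7b-Instruct-hf` repo on Hugging Face and that the instruct variant follows the Llama-2-style `[INST] ... [/INST]` prompt format; `accelerate` is assumed to be installed for `device_map="auto"`:

```python
# Sketch: instruction-tuned Code Llama for a code-assistant prompt.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "codellama/CodeLlama-7b-Instruct-hf"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id, torch_dtype=torch.float16, device_map="auto"
)

# The instruct variants expect Llama-2-style [INST] ... [/INST] turns.
prompt = "[INST] Write a Python function that reverses a linked list. [/INST]"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=200)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```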
## CodeLlama-7b-hf
**Publisher:** meta-llama · **Tags:** Large Language Model, Transformers, Other · **Downloads:** 4,650 · **Likes:** 101

Code Llama is Meta's family of code generation and understanding models with parameter scales ranging from 7B to 34B. This is the 7B base model.
## CodeLlama-7b-Python-hf
**Publisher:** codellama · **Tags:** Large Language Model, Transformers, Other · **Downloads:** 26.36k · **Likes:** 141

Code Llama Python is Meta's 7-billion-parameter, Python-specialized code generation model, built on the Llama 2 architecture and focused on Python code synthesis and comprehension.
## CodeGen2-1B_P
**License:** Apache-2.0 · **Publisher:** Salesforce · **Tags:** Large Language Model, Transformers · **Downloads:** 1,740 · **Likes:** 40

CodeGen2 is a family of autoregressive language models for program synthesis that supports multiple programming languages and adds infill (fill-in-the-middle) capability.
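A sketch of the infill capability, assuming this entry corresponds to the `Salesforce/codegen2-1B_P` checkpoint and the sentinel-token prompt format (`<mask_1>`, `<sep>`, `<eom>`) documented for the CodeGen2 family; `trust_remote_code=True` is assumed for the custom tokenizer:

```python
# Sketch: fill-in-the-middle with CodeGen2's mask sentinels.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Salesforce/codegen2-1B_P"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prefix = "def hello(name):\n"
suffix = "    return greeting\n"
# Infill prompt: prefix, a mask sentinel, the suffix, then a request
# for the content of mask 1 after the separator.
prompt = prefix + "<mask_1>" + suffix + "<|endoftext|>" + "<sep>" + "<mask_1>"

inputs = tokenizer(prompt, return_tensors="pt")
out = model.generate(**inputs, max_new_tokens=64)

# Keep only the newly generated tokens; the infill span ends at <eom>.
new_tokens = out[0][inputs["input_ids"].shape[1]:]
infill = tokenizer.decode(new_tokens, skip_special_tokens=False).split("<eom>")[0]
print(prefix + infill + suffix)
```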
## CodeGen-350M-mono
**License:** BSD-3-Clause · **Publisher:** Salesforce · **Tags:** Large Language Model, Transformers · **Downloads:** 23.59k · **Likes:** 93

CodeGen is a family of autoregressive language models for program synthesis; this "mono" variant is further pre-trained on Python-only data.
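A minimal sketch using the high-level `pipeline` API, assuming this entry corresponds to the `Salesforce/codegen-350M-mono` checkpoint:

```python
# Sketch: complete a Python function signature with CodeGen-350M-mono.
from transformers import pipeline

generator = pipeline("text-generation", model="Salesforce/codegen-350M-mono")
print(generator("def fibonacci(n):", max_new_tokens=48)[0]["generated_text"])
```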
## GPT-Neo-1.3B-Adventure
**License:** MIT · **Publisher:** KoboldAI · **Tags:** Large Language Model, Transformers, English · **Downloads:** 141 · **Likes:** 8

A fine-tuned version of EleutherAI's GPT-Neo 1.3B, specialized in adventure-style text generation.
## XGLM-2.9B
**License:** MIT · **Publisher:** facebook · **Tags:** Large Language Model, Transformers, Supports Multiple Languages · **Downloads:** 229 · **Likes:** 9

XGLM-2.9B is a multilingual autoregressive language model with 2.9 billion parameters, trained on a diverse and balanced corpus of 500 billion subword tokens across multiple languages.
## XGLM-1.7B
**License:** MIT · **Publisher:** facebook · **Tags:** Large Language Model, Transformers, Supports Multiple Languages · **Downloads:** 1,514 · **Likes:** 19

XGLM-1.7B is a multilingual autoregressive language model with 1.7 billion parameters, trained on a diverse and balanced corpus of 500 billion subword tokens.
## XGLM-4.5B
**License:** MIT · **Publisher:** facebook · **Tags:** Large Language Model, Transformers, Supports Multiple Languages · **Downloads:** 78 · **Likes:** 20

XGLM-4.5B is a multilingual autoregressive language model with 4.5 billion parameters, trained on a balanced corpus covering 134 languages.