AIbase
# 65k Long Context

Mixtral 8x22B V0.1 GGUF
License: Apache-2.0
Mixtral 8x22B is a sparse mixture-of-experts model released by Mistral AI (8 experts of 22B parameters each, 141B parameters in total), supporting multilingual text generation tasks.
Tags: Large Language Model, Supports Multiple Languages
Author: MaziyarPanahi
Downloads: 170.27k · Likes: 74
© 2025 AIbase