Mixtral 7b 8expert
Developed by DiscoResearch
A Mixture of Experts (MoE) model released by MistralAI, supporting multilingual text generation tasks
Downloads 57.47k
Release Time: 12/8/2023
Model Overview
This is a large language model based on the Mixture of Experts architecture, supporting text generation tasks in multiple languages including English, French, Italian, Spanish, and German.
Model Features
Mixture of Experts Architecture
Utilizes an 8-expert mixture architecture in which only a subset of experts is active per token, allowing tasks from different domains to be handled more efficiently (a minimal routing sketch follows this feature list)
Multilingual Support
Supports multiple languages including English, French, Italian, Spanish, and German
High Performance
Demonstrates strong results across multiple benchmarks, such as HellaSwag (0.8661) and MMLU (0.7173)
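
The 8-expert design is a sparse Mixture of Experts: a small router scores each token and only the top-ranked experts (two per token in Mixtral) are evaluated, so only a fraction of the total parameters is active on any forward pass. The sketch below is a minimal, hypothetical PyTorch illustration of top-2 routing; the class and parameter names are illustrative and are not taken from the DiscoResearch code.

```python
# Minimal sketch of top-2 expert routing in a sparse MoE layer.
# Hypothetical illustration only; names (MoELayer, n_experts, ...) are assumptions,
# not identifiers from the actual Mixtral implementation.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, dim: int, hidden: int, n_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(dim, n_experts, bias=False)   # router: scores each token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, hidden), nn.SiLU(), nn.Linear(hidden, dim))
            for _ in range(n_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). The router picks the top_k experts per token.
        scores = self.gate(x)                                 # (tokens, n_experts)
        weights, idx = torch.topk(scores, self.top_k, dim=-1) # keep the best 2 experts
        weights = F.softmax(weights, dim=-1)                  # renormalize over those 2
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = idx[:, k] == e                         # tokens routed to expert e
                if mask.any():
                    out[mask] += weights[mask, k:k+1] * expert(x[mask])
        return out

# Example: route 4 tokens of width 16 through 8 experts, 2 active per token.
tokens = torch.randn(4, 16)
print(MoELayer(dim=16, hidden=64)(tokens).shape)  # torch.Size([4, 16])
```

Because each token only touches 2 of the 8 expert feed-forward blocks, compute per token stays close to that of a dense 7B-class model even though the total parameter count is much larger.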
Model Capabilities
Multilingual Text Generation
Knowledge Q&A
Logical Reasoning
Use Cases
Text Generation
Multilingual Content Creation
Generate creative text content in various languages (see the usage sketch after this section)
Q&A System
Knowledge Q&A
Answer knowledge-based questions across various domains
Achieved an accuracy of 0.7173 on the MMLU benchmark
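
For the use cases above, the model can be driven through the standard Hugging Face transformers text-generation API. The sketch below is a minimal example under stated assumptions: the repository id DiscoResearch/mixtral-7b-8expert and the trust_remote_code flag are assumptions about this particular upload, not details confirmed by the card.

```python
# Hypothetical usage sketch with Hugging Face transformers.
# The repo id and trust_remote_code are assumptions and may differ in practice.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DiscoResearch/mixtral-7b-8expert"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,   # the full MoE model is large; half precision helps
    device_map="auto",           # shard layers across available GPUs
    trust_remote_code=True,      # assumed: early ports shipped custom modeling code
)

# Multilingual prompts: English, French, German.
prompts = [
    "Write a short poem about the sea.",
    "Écris un court poème sur la mer.",
    "Schreibe ein kurzes Gedicht über das Meer.",
]

for prompt in prompts:
    inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=80, do_sample=True, temperature=0.7)
    print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Even in half precision, loading all eight experts requires substantial GPU memory, which is why the sketch relies on device_map="auto" to spread the weights across whatever devices are available.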