
DiscoLM Mixtral 8x7b V2

Developed by DiscoResearch
An experimental 8x7b Mixture of Experts model based on Mistral AI's Mixtral 8x7b, fine-tuned on the Synthia, MetaMathQA, and Capybara datasets.
Downloads: 205
Release Time: 12/9/2023

Model Overview

DiscoLM Mixtral 8x7b is a large language model focused on text generation. It employs a Mixture of Experts architecture and is suited to dialogue systems and question-answering scenarios.
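
For reference, below is a minimal inference sketch using the Hugging Face transformers library. The repository id DiscoResearch/DiscoLM-mixtral-8x7b-v2 is assumed from the model name, and the dtype/device settings are illustrative; adjust them to your hardware.

```python
# Minimal inference sketch (assumption: the model is hosted on the
# Hugging Face Hub as "DiscoResearch/DiscoLM-mixtral-8x7b-v2" and a
# GPU with sufficient memory is available; adjust dtype/device as needed).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "DiscoResearch/DiscoLM-mixtral-8x7b-v2"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce memory use
    device_map="auto",          # spread the experts across available GPUs
)

prompt = "Explain what a Mixture of Experts model is in one paragraph."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=200, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```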

Model Features

Mixture of Experts Architecture
Uses an 8x7b Mixture of Experts architecture (eight experts of roughly 7B parameters each), routing each token to a subset of experts so complex tasks are handled efficiently; a minimal routing sketch follows this list
Multi-dataset Fine-tuning
Fine-tuned on the Synthia dialogue dataset, the MetaMathQA mathematical-reasoning dataset, and the Capybara dataset
Community-driven Development
Initiated by the DiscoResearch team, with compute support from HessianAI and assistance from the LAION team
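
To illustrate the idea behind the architecture (a generic sketch, not DiscoLM's or Mixtral's actual implementation), here is a minimal top-2 gated Mixture of Experts layer in PyTorch: a router scores each token against every expert, the two best-scoring experts process the token, and their outputs are combined by the normalized routing weights. Dimensions and expert internals are toy values chosen for readability.

```python
# Minimal top-2 Mixture of Experts layer (illustrative sketch only;
# dimensions and the expert MLP are toy choices, not DiscoLM's config).
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, dim: int, num_experts: int = 8, top_k: int = 2):
        super().__init__()
        self.router = nn.Linear(dim, num_experts)  # one score per expert per token
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(dim, 4 * dim), nn.GELU(), nn.Linear(4 * dim, dim))
            for _ in range(num_experts)
        )
        self.top_k = top_k

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, dim). Route each token to its top-k experts.
        scores = self.router(x)                            # (tokens, num_experts)
        weights, chosen = scores.topk(self.top_k, dim=-1)  # (tokens, top_k)
        weights = F.softmax(weights, dim=-1)               # normalize over the chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = chosen[:, k] == e                   # tokens routed to expert e in slot k
                if mask.any():
                    out[mask] += weights[mask, k].unsqueeze(-1) * expert(x[mask])
        return out

layer = MoELayer(dim=64)
tokens = torch.randn(10, 64)
print(layer(tokens).shape)  # torch.Size([10, 64])
```

Only the selected experts run for a given token, which is how an MoE model keeps per-token compute far below its total parameter count.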

Model Capabilities

Text Generation
Dialogue Systems
Q&A Systems
Mathematical Reasoning

Use Cases

Dialogue Systems
Intelligent Assistant
Serves as an AI assistant that answers user questions (see the chat-prompt sketch at the end of this section)
Achieved an average score of 7.7 on MT-Bench
Mathematical Reasoning
Math Problem Solving
Solves mathematical reasoning problems
Achieved 65.6% accuracy on GSM8k
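
As an illustration of the assistant use case, the sketch below shows chat-style prompting. It assumes the tokenizer ships a chat template (DiscoLM models typically use the ChatML format) and reuses the model and tokenizer variables from the loading sketch above.

```python
# Chat-style Q&A sketch (assumption: the tokenizer defines a chat
# template, e.g. ChatML; `model` and `tokenizer` come from the
# loading sketch earlier on this page).
messages = [
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "A train travels 120 km in 1.5 hours. What is its average speed?"},
]
prompt_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)
output = model.generate(prompt_ids, max_new_tokens=150)
# Decode only the newly generated tokens, not the prompt.
print(tokenizer.decode(output[0][prompt_ids.shape[-1]:], skip_special_tokens=True))
```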