
SOLAR Math 2x10.7b V0.2

Developed by macadeliccc
A large language model created by merging two SOLAR-10.7B instruction-tuned models, with reported performance comparable to GPT-3.5 and Gemini Pro and surpassing Mixtral-8x7b
Downloads: 92
Release Time: 1/16/2024

Model Overview

SOLAR-math-2x10.7b-v0.2 is a large language model focused on mathematical reasoning and text generation. It is built by merging two SOLAR-10.7B instruction-tuned models and delivers strong results across multiple benchmarks.
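For orientation, here is a minimal inference sketch. It assumes the model is published on the Hugging Face Hub under the repository name macadeliccc/SOLAR-math-2x10.7b-v0.2 and that the SOLAR-style "### User: / ### Assistant:" prompt format applies; both are assumptions, not details confirmed by this page.

import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "macadeliccc/SOLAR-math-2x10.7b-v0.2"  # assumed Hub repo name
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit a ~19B-parameter merge
    device_map="auto",          # spread layers across available devices
)

prompt = "### User:\nIf 3x + 7 = 22, what is x?\n\n### Assistant:\n"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))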

Model Features

High-performance mathematical reasoning
Excels at mathematical reasoning tasks (e.g., GSM8k), reaching 64.9% accuracy
Multi-task processing capability
Demonstrates balanced performance across multiple benchmarks (ARC, HellaSwag, MMLU, etc.)
Merged model architecture
Improves performance by combining two SOLAR-10.7B models (see the sketch after this list)
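The "2x10.7b" in the name suggests the two SOLAR models are combined as experts in a sparse mixture-of-experts layout rather than averaged weight by weight. The following PyTorch sketch illustrates that routing idea with toy dimensions and a hypothetical class name; it is not the model's actual architecture.

import torch
import torch.nn as nn
import torch.nn.functional as F

class TwoExpertMoE(nn.Module):
    # Toy two-expert feed-forward block: a router produces a per-token
    # weighting over two expert MLPs (illustrative only).
    def __init__(self, d_model: int, d_ff: int):
        super().__init__()
        self.router = nn.Linear(d_model, 2)  # one score per expert
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(2)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        weights = F.softmax(self.router(x), dim=-1)               # (batch, seq, 2)
        expert_outs = torch.stack([e(x) for e in self.experts], dim=-1)
        return (expert_outs * weights.unsqueeze(-2)).sum(dim=-1)  # weighted mix

layer = TwoExpertMoE(d_model=64, d_ff=256)
tokens = torch.randn(1, 8, 64)
print(layer(tokens).shape)  # torch.Size([1, 8, 64])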

Model Capabilities

Mathematical problem solving
Common sense reasoning
Text generation
Multiple-choice question answering
Language understanding

Use Cases

Education
Mathematical problem solving
Solves various mathematical problems, including proofs of complex theorems
Achieves 64.9% accuracy on the GSM8k test set (see the evaluation sketch after this section)
Scientific knowledge explanation
Explains complex scientific concepts and theories
Achieves 66.25% accuracy on the MMLU test set
Research
Academic literature analysis
Assists researchers in understanding and analyzing academic literature
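To make the GSM8k numbers above concrete, here is a hedged sketch of how such an evaluation is commonly scored: prompt the model with a grade-school word problem, let it reason step by step, then compare the last number in the completion against the reference answer. The prompt template, the generate_text placeholder, and the answer-extraction heuristic are illustrative assumptions, not the benchmark's official harness.

import re

def extract_final_answer(completion: str) -> str | None:
    # Take the last number in the completion, a common simple
    # heuristic for scoring GSM8k-style chain-of-thought output.
    numbers = re.findall(r"-?\d+(?:\.\d+)?", completion.replace(",", ""))
    return numbers[-1] if numbers else None

question = (
    "Natalia sold clips to 48 of her friends in April, and then she sold "
    "half as many clips in May. How many clips did Natalia sell altogether?"
)
prompt = f"### User:\n{question}\nThink step by step.\n\n### Assistant:\n"

def generate_text(p: str) -> str:
    # Placeholder standing in for the tokenizer/generate pipeline
    # shown in the loading sketch earlier on this page.
    return "In May she sold 48 / 2 = 24 clips, so 48 + 24 = 72 in total."

completion = generate_text(prompt)
print(extract_final_answer(completion) == "72")  # True for this example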