
Orthogonal 2x7B V2 Base

Developed by: LoSboccacc
Downloads: 80
Released: 1/18/2024

orthogonal-2x7B-v2-base is a Mixture of Experts (MoE) model built from Mistral-7B-Instruct-v0.2 and SanjiWatsuki/Kunoichi-DPO-v2-7B, specializing in text generation tasks.

Model Overview

This model combines two expert models, one strong at role-play and one at general chat, and uses a gating mechanism to dynamically route each input to the most suitable expert during text generation.
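To make the gating idea concrete, below is a minimal, illustrative sketch of top-k expert routing in the style of a Mixtral-like MoE layer. The class and parameter names are hypothetical and not taken from this model's actual implementation; they only show how a gate scores experts and blends the selected experts' outputs per token.

```python
import torch
import torch.nn as nn
import torch.nn.functional as F

class TopKMoELayer(nn.Module):
    """Illustrative sparse MoE layer (hypothetical, not this model's code):
    a linear gate scores each expert, and the top-k experts' outputs
    are blended per token with softmax-normalized gate weights."""

    def __init__(self, hidden_size: int, num_experts: int = 2, top_k: int = 2):
        super().__init__()
        self.top_k = top_k
        self.gate = nn.Linear(hidden_size, num_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(
                nn.Linear(hidden_size, 4 * hidden_size),
                nn.SiLU(),
                nn.Linear(4 * hidden_size, hidden_size),
            )
            for _ in range(num_experts)
        )

    def forward(self, x: torch.Tensor) -> torch.Tensor:
        # x: (tokens, hidden_size); the gate produces one score per expert.
        scores = self.gate(x)                              # (tokens, num_experts)
        weights, chosen = scores.topk(self.top_k, dim=-1)  # per-token expert choice
        weights = F.softmax(weights, dim=-1)               # normalize over chosen experts
        out = torch.zeros_like(x)
        for slot in range(self.top_k):
            for idx, expert in enumerate(self.experts):
                mask = chosen[:, slot] == idx              # tokens routed to this expert
                if mask.any():
                    out[mask] += weights[mask, slot].unsqueeze(-1) * expert(x[mask])
        return out

layer = TopKMoELayer(hidden_size=64)
tokens = torch.randn(5, 64)
print(layer(tokens).shape)  # torch.Size([5, 64])
```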

Model Features

Mixture of Experts architecture
Combines two 7B-parameter expert models, dynamically selecting the most suitable one for each input through a gating mechanism.
Multi-task optimization
Uses expert models specialized for different tasks (e.g., role-play and chat) to improve generation quality.
Efficient inference
Compared with a single large dense model, the Mixture of Experts architecture can maintain performance while improving inference efficiency.

Model Capabilities

Text generation
Dialogue systems
Role-play
Question answering systems
Reasoning tasks

Use Cases

Dialogue systems
Intelligent chat assistant
Builds natural, fluent conversational agents; the model reaches 85.69% accuracy on the HellaSwag benchmark.
Education
Scientific Q&A system
Answers science-related questions; the model reaches 66.89% accuracy on the AI2 Reasoning Challenge (ARC).
Mathematical reasoning
Math problem solving
Solves basic math word problems; the model reaches 51.4% accuracy on the GSM8K benchmark.
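As a quick start for the dialogue use case above, here is a minimal sketch of loading the model with Hugging Face Transformers. The repository id LoSboccacc/orthogonal-2x7B-v2-base is assumed from the model name and developer, the chat template is assumed to follow the Mistral-Instruct base model, and all generation settings are illustrative rather than recommended values.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repository id, inferred from the model name and developer.
model_id = "LoSboccacc/orthogonal-2x7B-v2-base"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit the 2x7B weights on one GPU
    device_map="auto",
)

# Mistral-Instruct-style chat formatting via the tokenizer's chat template.
messages = [{"role": "user", "content": "Explain photosynthesis in two sentences."}]
inputs = tokenizer.apply_chat_template(messages, return_tensors="pt").to(model.device)

output = model.generate(inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0][inputs.shape[-1]:], skip_special_tokens=True))
```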