Blue Orchid 2x7b
A Mixture of Experts (MoE) Mistral model for role-playing that combines the strengths of role-playing and story-creation experts
Downloads 274
Release Date: January 30, 2024
Model Overview
A Mixture of Experts model based on SanjiWatsuki/Kunoichi-DPO-v2-7B, consisting of two expert models fused together: one specialized in role-playing and the other in story creation.
Model Features
Mixture of Experts Architecture
Combines the strengths of a role-playing expert and a story-creation expert to provide more versatile text generation
Multi-model Fusion
Expert 1 is a merge of several role-playing models; Expert 2 is a merge of several story-creation models
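Two-expert Mistral MoE models of this kind are commonly assembled with a mergekit-moe style configuration, where each expert is a merged 7B model and positive prompts steer the router. The sketch below is hypothetical: the expert model names and gate prompts are placeholders, not the actual sources used for Blue Orchid 2x7b.

```yaml
# Hypothetical mergekit-moe config sketch (placeholder expert names).
base_model: SanjiWatsuki/Kunoichi-DPO-v2-7B
gate_mode: hidden            # route on hidden-state similarity to the prompts
experts:
  - source_model: roleplay-expert-7b      # placeholder: merged role-playing model
    positive_prompts:
      - "roleplay"
      - "character dialogue"
  - source_model: storywriting-expert-7b  # placeholder: merged story-creation model
    positive_prompts:
      - "write a story"
      - "plot development"
```

The positive prompts bias the learned gate so role-playing turns favor one expert and narrative passages favor the other.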
Supports Multiple Prompt Templates
Compatible with LimaRP and Alpaca prompt template formats
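Since the model accepts Alpaca-format prompts, requests can be wrapped in the standard Alpaca template before being sent to the model. A minimal sketch, assuming the conventional Alpaca system line (the exact wording the model expects may differ; check the model card):

```python
def build_alpaca_prompt(instruction: str, user_input: str = "") -> str:
    """Wrap an instruction (and optional input) in the standard Alpaca template."""
    prompt = (
        "Below is an instruction that describes a task. "
        "Write a response that appropriately completes the request.\n\n"
        f"### Instruction:\n{instruction}\n\n"
    )
    if user_input:
        # The Input section is only included when context is provided.
        prompt += f"### Input:\n{user_input}\n\n"
    prompt += "### Response:\n"
    return prompt

# Example: a role-playing instruction with scene context.
prompt = build_alpaca_prompt(
    "Continue the dialogue in character as the innkeeper.",
    "The traveler asks about rooms for the night.",
)
```

The resulting string ends with the `### Response:` header, so generation continues directly as the model's reply.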
Model Capabilities
Role-playing dialogue
Story creation
Long text generation
Multi-turn dialogue
Use Cases
Entertainment
Interactive Role-playing
Engage in immersive dialogue interactions with AI characters
Provides coherent character responses and plot development
Content Creation
Story Generation
Assists writers in story creation and plot development
Generates creative storylines and character dialogues