
J.O.S.I.E.3 Beta12 7B Slerp

Developed by Goekdeniz-Guelmez
J.O.S.I.E.3-Beta12-7B-slerp is a 7B-parameter large language model created by merging Weyaxi/Einstein-v6-7B and argilla/CapybaraHermes-2.5-Mistral-7B via slerp. It supports multilingual interaction and uses the ChatML prompt format.
Downloads: 17
Release Time: 4/23/2024

Model Overview

The model is positioned as a private, super-intelligent AI assistant focused on high-quality conversation and question answering, with support for multiple languages and complex task processing.

Model Features

Multilingual Support
Supports interaction in 6 languages including Chinese
Merged Model Advantage
Combines the strengths of the Einstein-v6 and CapybaraHermes parent models through slerp merging (a conceptual sketch follows this list)
ChatML Format
Adopts the standardized ChatML prompt format for easy integration into dialogue systems
Quantization Support
Provides GGUF quantized versions for deployment on a range of hardware (see the inference sketch after the capabilities list)
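
Slerp (spherical linear interpolation) blends each pair of corresponding weight tensors along the arc between them rather than along a straight line, which tends to preserve the geometry of both parent models better than plain averaging. The following is a minimal NumPy sketch of the idea; it is illustrative only and does not reproduce the exact merge recipe behind this model (the tensor shapes and the interpolation factor t=0.5 are assumptions).

```python
import numpy as np

def slerp(w_a: np.ndarray, w_b: np.ndarray, t: float, eps: float = 1e-8) -> np.ndarray:
    """Spherical linear interpolation between two corresponding weight tensors."""
    a, b = w_a.ravel().astype(np.float64), w_b.ravel().astype(np.float64)
    # Angle between the two parameter vectors, measured on their unit directions.
    dot = np.clip(np.dot(a / (np.linalg.norm(a) + eps), b / (np.linalg.norm(b) + eps)), -1.0, 1.0)
    omega = np.arccos(dot)
    if omega < eps:
        # Nearly parallel tensors: fall back to ordinary linear interpolation.
        merged = (1.0 - t) * a + t * b
    else:
        sin_omega = np.sin(omega)
        merged = (np.sin((1.0 - t) * omega) / sin_omega) * a + (np.sin(t * omega) / sin_omega) * b
    return merged.reshape(w_a.shape).astype(w_a.dtype)

# Toy example: blend two hypothetical layers halfway between the parent models.
layer_einstein = np.random.randn(4, 4).astype(np.float32)
layer_capybara = np.random.randn(4, 4).astype(np.float32)
merged_layer = slerp(layer_einstein, layer_capybara, t=0.5)
print(merged_layer.shape)  # (4, 4)
```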

Model Capabilities

Multilingual text generation
Intelligent dialogue
Knowledge Q&A
Task completion
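
As a rough illustration of the dialogue capabilities above, the sketch below loads one of the GGUF quantized builds with llama-cpp-python and sends it a ChatML-formatted prompt. The GGUF filename, system message, and sampling settings are assumptions; substitute the quantized file you actually downloaded and tune the parameters to your hardware.

```python
# pip install llama-cpp-python
from llama_cpp import Llama

# Hypothetical filename; point this at the GGUF quantization you downloaded.
llm = Llama(
    model_path="J.O.S.I.E.3-Beta12-7B-slerp.Q4_K_M.gguf",
    n_ctx=4096,        # context window
    n_gpu_layers=-1,   # offload all layers to GPU if available; set to 0 for CPU-only
)

# ChatML wraps every turn in <|im_start|>role ... <|im_end|> markers.
prompt = (
    "<|im_start|>system\n"
    "You are J.O.S.I.E., a helpful multilingual assistant.<|im_end|>\n"
    "<|im_start|>user\n"
    "Summarize the advantages of merging two language models in two sentences.<|im_end|>\n"
    "<|im_start|>assistant\n"
)

output = llm(prompt, max_tokens=256, stop=["<|im_end|>"], temperature=0.7)
print(output["choices"][0]["text"])
```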

Use Cases

Personal Assistant
Personal AI Assistant
Serves as a daily personal assistant to answer various questions and provide advice
Achieved 83.98% normalized accuracy on the HellaSwag test set
Education
Subject Knowledge Q&A
Answers high school and university-level questions across various subjects
Achieved 79.8% accuracy in high school geography tests