
Chupacabra 7B V2

Developed by perlthoughts
A 7B-parameter large language model based on the Mistral architecture that uses SLERP (spherical linear interpolation) merging to fuse weights from multiple high-performance models.
Downloads: 99
Released: 11/21/2023

Model Overview

This model merges several Mistral-based models via SLERP weight fusion and performs well on text generation and a range of other natural language processing tasks.

Model Features

SLERP fusion technology
Uses spherical linear interpolation (SLERP) instead of simple weight averaging, which better preserves the geometric characteristics of the parent models' weights.
High-performance training
Merges weights from models trained with advanced methods such as supervised fine-tuning (SFT), direct preference optimization (DPO), and reinforcement learning.
Multi-task optimization
Performs strongly across multiple benchmarks, including ARC, HellaSwag, and MMLU.
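To illustrate the SLERP merging idea mentioned above, here is a minimal sketch of spherical linear interpolation applied to two flattened weight tensors. This is an illustrative implementation of the general SLERP formula, not the exact merge code used for this model (tools such as mergekit apply it per-tensor with configurable interpolation factors); the function name and parameters are assumptions for the example.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t:  interpolation factor in [0, 1] (0 -> v0, 1 -> v1)
    v0, v1: flattened parent-model weight tensors of equal shape
    """
    # Normalize copies to measure the angle between the two directions.
    v0_n = v0 / (np.linalg.norm(v0) + eps)
    v1_n = v1 / (np.linalg.norm(v1) + eps)
    dot = np.clip(np.dot(v0_n, v1_n), -1.0, 1.0)
    theta = np.arccos(dot)  # angle between the two weight directions

    # Nearly parallel vectors: fall back to ordinary linear interpolation.
    if theta < eps:
        return (1.0 - t) * v0 + t * v1

    sin_theta = np.sin(theta)
    # Interpolate along the great arc between v0 and v1.
    return (np.sin((1.0 - t) * theta) / sin_theta) * v0 \
         + (np.sin(t * theta) / sin_theta) * v1
```

Unlike plain averaging, SLERP moves along the arc between the two weight directions, so the interpolated weights keep a sensible magnitude instead of shrinking toward the origin when the parents point in different directions.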

Model Capabilities

Text generation
Question answering
Reasoning tasks
Common sense understanding

Use Cases

Education
Academic Q&A
Answers questions across various academic disciplines
Achieves 63.6% accuracy on the MMLU benchmark
Research
Reasoning challenges
Solves complex reasoning problems
Achieves 65.19% normalized accuracy on the AI2 Reasoning Challenge (ARC)