Cydonia V1.2 Magnum V4 22B
A 22B-parameter language model created by merging Cydonia-22B-v1.2 and Magnum-v4-22b with the SLERP method.
Downloads: 52
Release date: 10/26/2024
Model Overview
This is a 22B-parameter language model produced with the mergekit tool. It combines the strengths of the Cydonia and Magnum models and is suited to tasks such as text generation.
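The card does not publish the exact merge recipe. As a hedged sketch only, a SLERP merge of two 22B models in mergekit is typically declared in a YAML file along these lines; the repository IDs, layer range, and interpolation weight `t` below are illustrative assumptions, not values confirmed by this card:

```yaml
# Hypothetical mergekit recipe for a SLERP merge of the two source models.
# Repo IDs, layer_range, and t are assumptions for illustration.
slices:
  - sources:
      - model: TheDrummer/Cydonia-22B-v1.2
        layer_range: [0, 56]
      - model: anthracite-org/magnum-v4-22b
        layer_range: [0, 56]
merge_method: slerp
base_model: TheDrummer/Cydonia-22B-v1.2
parameters:
  t: 0.5          # 0.0 = pure base model, 1.0 = pure second model
dtype: bfloat16
```

With mergekit installed, such a recipe is normally run via `mergekit-yaml config.yaml ./output-model`.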
Model Features
Model merging technique
Uses the SLERP (spherical linear interpolation) method to fuse two 22B-parameter models, preserving the strengths of each.
Large parameter scale
Its 22B parameters provide stronger language understanding and generation capabilities than smaller models.
Based on Mistral architecture
Inherits the efficiency characteristics of the Mistral architecture.
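SLERP interpolates between two weight tensors along the surface of a hypersphere instead of along a straight line, which tends to preserve the magnitude structure of both parents better than plain averaging. The following is a minimal illustrative sketch of per-tensor SLERP in NumPy (not mergekit's actual implementation; the function name and fallback behavior are assumptions):

```python
import numpy as np

def slerp(t, a, b, eps=1e-8):
    """Spherical linear interpolation between two weight tensors.

    t=0.0 returns a, t=1.0 returns b; intermediate t values blend
    the tensors along the great-circle arc between their directions.
    """
    a_flat = a.ravel().astype(np.float64)
    b_flat = b.ravel().astype(np.float64)
    # Normalize to unit vectors to find the angle between the tensors.
    a_n = a_flat / (np.linalg.norm(a_flat) + eps)
    b_n = b_flat / (np.linalg.norm(b_flat) + eps)
    dot = np.clip(np.dot(a_n, b_n), -1.0, 1.0)
    theta = np.arccos(dot)
    if theta < eps:
        # Nearly parallel tensors: fall back to linear interpolation.
        return ((1.0 - t) * a_flat + t * b_flat).reshape(a.shape)
    s = np.sin(theta)
    w = (np.sin((1.0 - t) * theta) / s) * a_flat \
        + (np.sin(t * theta) / s) * b_flat
    return w.reshape(a.shape)
```

In a real merge this interpolation is applied tensor by tensor across both checkpoints, often with different `t` values for different layer groups.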
Model Capabilities
Text generation
Language understanding
Use Cases
Content creation
Creative writing
Generating creative texts such as stories and poems
Dialogue systems
Intelligent conversation
Building more natural conversational AI systems