HMS Slerp 12B V2
This is a 12B-parameter multilingual large language model created by merging two models with the SLERP method; it supports English and Japanese.
Large Language Model
Tags: Transformers · Multilingual · Multilingual Dialogue · Japanese-English Mix · ChatML Format

Downloads: 16
Release date: 4/27/2025
Model Overview
This model is a merged language model created by combining two 12B-parameter models (Himeyuri-Magnum-12B and Shisa-v2-Mistral-Nemo-12B-Abliterated) with the SLERP (spherical linear interpolation) method via the mergekit tool.
Model Features
Multilingual Support
Handles both English and Japanese text, making it suitable for cross-language applications
Model Fusion Technology
Uses SLERP (spherical linear interpolation) to merge the two source models, interpolating their weights along an arc on the unit sphere rather than along a straight line, so the merged model combines the strengths of both
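To make the merging idea concrete, here is a minimal sketch of SLERP on flattened weight vectors, as commonly implemented in merging tools. This is an illustrative NumPy version, not the exact mergekit code; the fallback to linear interpolation for near-parallel vectors is a standard numerical-stability detail.

```python
import numpy as np

def slerp(t, v0, v1, eps=1e-8):
    """Spherical linear interpolation between two weight vectors.

    t=0 returns v0, t=1 returns v1; intermediate t follows the arc
    between the two directions instead of the straight chord.
    """
    v0n = v0 / np.linalg.norm(v0)
    v1n = v1 / np.linalg.norm(v1)
    dot = np.clip(np.dot(v0n, v1n), -1.0, 1.0)
    theta = np.arccos(dot)          # angle between the two vectors
    if theta < eps:                 # nearly parallel: fall back to lerp
        return (1 - t) * v0 + t * v1
    s = np.sin(theta)
    return (np.sin((1 - t) * theta) / s) * v0 + (np.sin(t * theta) / s) * v1

# Blend two unit vectors halfway along the arc; the result stays on
# the unit sphere, which plain linear interpolation would not guarantee.
a = np.array([1.0, 0.0])
b = np.array([0.0, 1.0])
mid = slerp(0.5, a, b)
```

In a real merge this interpolation is applied tensor by tensor across the two checkpoints, with the interpolation factor `t` configurable per layer.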
Large-Scale Parameters
12B parameter scale with strong language understanding and generation capabilities
Model Capabilities
Text generation
Multilingual processing
Dialogue systems
Use Cases
Dialogue Systems
Multilingual Chatbot
Build intelligent dialogue systems supporting English and Japanese
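Since the model card lists ChatML as the prompt format, a chatbot built on it would wrap each turn in ChatML delimiters. The helper below is a hypothetical sketch of that formatting (in practice a tokenizer's built-in chat template would do this); only the `<|im_start|>`/`<|im_end|>` markers are standard ChatML.

```python
def chatml(messages):
    """Format a list of {role, content} dicts as a ChatML prompt string,
    ending with an open assistant turn for the model to complete."""
    parts = [
        f"<|im_start|>{m['role']}\n{m['content']}<|im_end|>"
        for m in messages
    ]
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)

# Mixed English/Japanese dialogue, matching the model's supported languages.
prompt = chatml([
    {"role": "system", "content": "You are a bilingual English/Japanese assistant."},
    {"role": "user", "content": "こんにちは！自己紹介してください。"},
])
```

The trailing open `<|im_start|>assistant` turn cues the model to generate the assistant's reply; generation is typically stopped at the next `<|im_end|>` token.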
Content Generation
Cross-Language Content Creation
Generate creative text content in English or Japanese
© 2025 AIbase