MagTie-v1-12B
Developed by grimjim
MagTie-v1-12B is a 12B-parameter language model created with the DARE TIES merge algorithm, combining the strengths of multiple pre-trained models.
Downloads: 32
Release Date: 5/3/2025
Model Overview
This model merges multiple pre-trained 12B-parameter models, aiming to preserve the strongest features of each contributor, and is suited to text generation tasks.
Model Features
Multi-model merging
Uses the DARE TIES algorithm to merge multiple 12B-parameter models while preserving advantageous features from each
Weight optimization
Tunes the merge by adjusting per-model weight and density parameters (0.85/0.75)
Pre-training foundation
Built upon Mistral-Nemo-2407 base model
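Merges of this kind are commonly expressed as a mergekit configuration file. The fragment below is an illustrative sketch only: the contributing model names are placeholders, reading 0.85 as the merge weight and 0.75 as the density is an assumption about the card's "(0.85/0.75)" note, and only the DARE TIES method and the Mistral-Nemo-2407 base come from the card.

```yaml
# Hypothetical mergekit config; contributor names are placeholders
merge_method: dare_ties
base_model: Mistral-Nemo-2407        # base model as named on this card
models:
  - model: example/contributor-a-12B # placeholder
    parameters:
      weight: 0.85                   # assumed mapping of the card's 0.85/0.75
      density: 0.75
  - model: example/contributor-b-12B # placeholder
    parameters:
      weight: 0.85
      density: 0.75
dtype: bfloat16
```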
Model Capabilities
Text generation
Language understanding
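To make the merge method above concrete, here is a minimal toy sketch of a DARE TIES-style merge over flat parameter vectors: DARE randomly drops a fraction of each model's delta against the base and rescales the survivors, then TIES elects a dominant sign per parameter and averages only the agreeing deltas. All tensor shapes, names, and the exact weighting order are illustrative assumptions, not the model's actual merge code.

```python
import numpy as np

def dare_ties_merge(base, finetuned, density, weights, rng):
    """Toy DARE TIES merge over flat parameter vectors (illustrative sketch)."""
    deltas = []
    for params, w in zip(finetuned, weights):
        delta = params - base                         # task vector relative to the base
        keep = rng.random(delta.shape) < density      # DARE: randomly drop 1 - density of entries
        delta = np.where(keep, delta / density, 0.0)  # rescale survivors to preserve expectation
        deltas.append(w * delta)                      # per-model merge weight (assumed ordering)
    stacked = np.stack(deltas)
    elected = np.sign(stacked.sum(axis=0))            # TIES: elect a dominant sign per parameter
    agree = np.sign(stacked) == elected               # keep only deltas matching the elected sign
    kept = np.where(agree, stacked, 0.0)
    counts = np.maximum(agree.sum(axis=0), 1)         # guard against all-dropped parameters
    return base + kept.sum(axis=0) / counts

rng = np.random.default_rng(0)
base = np.zeros(8)
model_a = base + 1.0   # stand-in fine-tunes; real merges operate per weight tensor
model_b = base - 0.5
merged = dare_ties_merge(base, [model_a, model_b], density=0.75,
                         weights=[0.85, 0.85], rng=rng)
```

With density=1.0 and a single model at weight 1.0, the merge reduces to that model exactly, which makes the drop-and-rescale step easy to sanity-check.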
Use Cases
Content creation
Creative writing
Generating creative text such as stories and poems
Dialogue systems
Intelligent chatbot
Building natural and fluent conversation systems
© 2025 AIbase