# The-True-Abomination-24B

This is a merged pre-trained language model created using mergekit, aiming to combine the strengths of multiple models for text generation.
## Quick Start

This model is a merge of pre-trained language models created using mergekit, and can be loaded like any standard `transformers` causal language model for text generation.
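A minimal loading sketch with `transformers`. The repo id below is a placeholder (substitute the actual Hugging Face path this card is published under), and the `[INST]` prompt wrapping is an assumption based on the Mistral-Small-24B lineage of the base model:

```python
# Minimal generation sketch. MODEL_ID is a placeholder -- replace it with the
# actual repo path. The [INST] wrapping assumes a Mistral-style instruct
# template, which is not confirmed by this card.
MODEL_ID = "The-True-Abomination-24B"  # placeholder repo id

def format_prompt(user_message: str) -> str:
    """Wrap a user turn in Mistral-style instruct tags (assumed template)."""
    return f"[INST] {user_message} [/INST]"

if __name__ == "__main__":
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16, device_map="auto"
    )
    inputs = tokenizer(format_prompt("Stay in character and greet me."),
                       return_tensors="pt").to(model.device)
    output = model.generate(**inputs, max_new_tokens=256,
                            do_sample=True, temperature=0.8)
    print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:],
                           skip_special_tokens=True))
```

In SillyTavern, pointing a local backend (e.g. a llama.cpp or TabbyAPI server hosting this model) at the API connection panel achieves the same thing without any code.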
## Merge Details

### Description
This model is a merge created with the same idea as [Casual-Autopsy/L3-Super-Nova-RP-8B](https://huggingface.co/Casual-Autopsy/L3-Super-Nova-RP-8B) in mind: compatibility with as many SillyTavern features and extensions as possible while still being able to handle itself in roleplay.

Reasoning isn't perfect, but it certainly helps boost the model's capability with legacy reasoning (status block/thinking box CoT), which I honestly prefer over modern reasoning; it's less immersion-breaking IMO. Setting max reasoning prompts to 3 or more and/or injecting a CoT formatting prompt is recommended.
### Merge Method

This model was merged using the SCE, Della, and CABS merge methods, with [TheDrummer/Cydonia-24B-v2](https://huggingface.co/TheDrummer/Cydonia-24B-v2) as the base.
### Models Merged
The following models were included in the merge:
- [TheDrummer/Cydonia-24B-v2](https://huggingface.co/TheDrummer/Cydonia-24B-v2)
- [ReadyArt/Gaslit-Transgression-24B-v1.0](https://huggingface.co/ReadyArt/Gaslit-Transgression-24B-v1.0)
- [ReadyArt/Forgotten-Safeword-24B-v4.0](https://huggingface.co/ReadyArt/Forgotten-Safeword-24B-v4.0)
- [ReadyArt/The-Omega-Directive-M-24B-v1.1](https://huggingface.co/ReadyArt/The-Omega-Directive-M-24B-v1.1)
- [ReadyArt/Omega-Darker_The-Final-Directive-24B](https://huggingface.co/ReadyArt/Omega-Darker_The-Final-Directive-24B)
- [Mawdistical/Mawdistic-NightLife-24b](https://huggingface.co/Mawdistical/Mawdistic-NightLife-24b)
- [Undi95/MistralThinker-v1.1](https://huggingface.co/Undi95/MistralThinker-v1.1)
- [cognitivecomputations/Dolphin-Mistral-24B-Venice-Edition](https://huggingface.co/cognitivecomputations/Dolphin-Mistral-24B-Venice-Edition)
- [AlexBefest/CardProjector-24B-v3](https://huggingface.co/AlexBefest/CardProjector-24B-v3)
- [arcee-ai/Arcee-Blitz](https://huggingface.co/arcee-ai/Arcee-Blitz)
- [TroyDoesAI/BlackSheep-24B](https://huggingface.co/TroyDoesAI/BlackSheep-24B)
### Configuration

The following YAML configurations were used to produce this model:
#### Gaslit-Safeword

```yaml
models:
  - model: TheDrummer/Cydonia-24B-v2
  - model: ReadyArt/Forgotten-Safeword-24B-v4.0
    parameters:
      weight: 0.4
      density: 0.35
      epsilon: 0.3
  - model: ReadyArt/Gaslit-Transgression-24B-v1.0
    parameters:
      weight: 0.4
      density: 0.35
      epsilon: 0.3
merge_method: della
base_model: TheDrummer/Cydonia-24B-v2
parameters:
  normalize: true
dtype: bfloat16
```
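As a rough intuition for the `weight`, `density`, and `epsilon` parameters in the Della configs: each model's delta from the base is pruned stochastically, with keep-probabilities centered on `density` and spread by `epsilon` according to magnitude rank, survivors rescaled so the result stays unbiased, then weighted and added back to the base. A toy NumPy sketch of that idea (an illustration, not mergekit's implementation):

```python
import numpy as np

def della_step(base, models, weights, density=0.35, epsilon=0.3, seed=0):
    """Toy DELLA-style merge of flat parameter vectors (illustrative only)."""
    rng = np.random.default_rng(seed)
    merged = base.astype(float).copy()
    for model, w in zip(models, weights):
        delta = model - base
        # Rank entries by |magnitude|: rank 0 = smallest, n-1 = largest.
        ranks = np.argsort(np.argsort(np.abs(delta)))
        n = delta.size
        # Keep-probabilities span [density - eps/2, density + eps/2],
        # so larger deltas are more likely to survive pruning.
        probs = np.clip(density - epsilon / 2 + epsilon * ranks / max(n - 1, 1),
                        1e-6, 1.0)
        keep = rng.random(n) < probs
        # Rescale survivors by 1/p so the pruned delta is unbiased in expectation.
        merged += w * np.where(keep, delta / probs, 0.0)
    return merged

base = np.zeros(8)
tuned_a = np.array([1.0, -2.0, 0.5, 3.0, 0.0, -1.0, 2.0, 0.1])
tuned_b = np.array([0.5, 1.0, -1.0, 0.0, 2.0, 0.3, -2.0, 1.0])
merged = della_step(base, [tuned_a, tuned_b], weights=[0.4, 0.4])
```

With `density: 1.0` and `epsilon: 0` nothing is pruned and the step degenerates to a plain weighted task-arithmetic merge.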
#### Omega-Duo

```yaml
models:
  - model: TheDrummer/Cydonia-24B-v2
  - model: ReadyArt/The-Omega-Directive-M-24B-v1.1
    parameters:
      weight: 0.4
      density: 0.35
      epsilon: 0.3
  - model: ReadyArt/Omega-Darker_The-Final-Directive-24B
    parameters:
      weight: 0.4
      density: 0.35
      epsilon: 0.3
merge_method: della
base_model: TheDrummer/Cydonia-24B-v2
parameters:
  normalize: true
dtype: bfloat16
```
#### SCE-Abomination

```yaml
models:
  - model: TheDrummer/Cydonia-24B-v2
  - model: Mawdistical/Mawdistic-NightLife-24b
  - model: Gaslit-Safeword
  - model: Omega-Duo
merge_method: sce
base_model: TheDrummer/Cydonia-24B-v2
parameters:
  select_topk: 0.8
dtype: bfloat16
```
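As intuition for `select_topk` in the SCE step: for each parameter position, SCE measures how much the candidate deltas disagree and keeps only the highest-variance fraction before fusing. A simplified NumPy sketch of just that selection step (the real method also derives per-model weights and erases minority signs, which this omits):

```python
import numpy as np

def sce_select(base, models, select_topk=0.8):
    """Toy SCE-style fuse: keep only the top-k highest-variance positions."""
    deltas = np.stack([m - base for m in models])
    variance = deltas.var(axis=0)          # disagreement across candidates
    k = max(1, int(round(select_topk * variance.size)))
    threshold = np.sort(variance.ravel())[::-1][k - 1]
    mask = variance >= threshold           # the "select" step of SCE
    return base + np.where(mask, deltas.mean(axis=0), 0.0)

base = np.zeros(6)
merged = sce_select(base, [np.arange(6.0), np.arange(6.0) * 2], select_topk=0.5)
```

With `select_topk: 0.8` as above, the 20% of positions where the four inputs agree most are left at the base model's values.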
#### UNC-Reasoning

```yaml
models:
  - model: SCE-Abomination
  - model: Undi95/MistralThinker-v1.1
    parameters:
      weight: 0.6
      n_val: 16
      m_val: 32
  - model: cognitivecomputations/Dolphin-Mistral-24B-Venice-Edition
    parameters:
      weight: 0.4
      n_val: 11
      m_val: 33
merge_method: cabs
default_n_val: 8
default_m_val: 32
pruning_order:
  - Undi95/MistralThinker-v1.1
  - cognitivecomputations/Dolphin-Mistral-24B-Venice-Edition
base_model: SCE-Abomination
dtype: bfloat16
```
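As intuition for the `n_val`/`m_val` parameters: CABS sparsifies each delta with n:m block sparsity (keep the `n` largest-magnitude entries in every window of `m`), and applies models in `pruning_order` so that later models can only claim positions earlier ones left free, keeping their edits from conflicting. An illustrative NumPy sketch under those assumptions (not mergekit's implementation):

```python
import numpy as np

def nm_mask(delta, n, m, taken):
    """Keep the n largest-|magnitude| *free* entries in every window of m."""
    mask = np.zeros(delta.size, dtype=bool)
    for start in range(0, delta.size, m):
        stop = min(start + m, delta.size)
        # Zero out positions already claimed by earlier models.
        block = np.abs(np.where(taken[start:stop], 0.0, delta[start:stop]))
        top = np.argsort(block)[::-1][:n]
        mask[start + top[block[top] > 0]] = True
    return mask

def cabs_step(base, ordered_models, weights, n_vals, m_vals):
    """Toy CABS-style merge: non-overlapping n:m-sparse deltas, in order."""
    taken = np.zeros(base.size, dtype=bool)   # positions already claimed
    merged = base.astype(float).copy()
    for model, w, n, m in zip(ordered_models, weights, n_vals, m_vals):
        delta = model - base
        mask = nm_mask(delta, n, m, taken)
        merged += w * np.where(mask, delta, 0.0)
        taken |= mask                         # later models avoid these spots
    return merged

base = np.zeros(8)
first = np.array([3.0, 1.0, 0.5, 2.0, 4.0, 0.2, 0.1, 5.0])
second = np.array([1.0, 2.0, 3.0, 4.0, 1.0, 1.0, 1.0, 1.0])
merged = cabs_step(base, [first, second], [1.0, 1.0], n_vals=[1, 1], m_vals=[4, 4])
```

This is why `pruning_order` matters in the configs: the reasoning model listed first gets first pick of which weights it keeps.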
#### INT-Multitasks

```yaml
models:
  - model: SCE-Abomination
  - model: AlexBefest/CardProjector-24B-v3
    parameters:
      weight: 0.6
      n_val: 16
      m_val: 32
  - model: arcee-ai/Arcee-Blitz
    parameters:
      weight: 0.4
      n_val: 11
      m_val: 33
merge_method: cabs
default_n_val: 8
default_m_val: 32
pruning_order:
  - AlexBefest/CardProjector-24B-v3
  - arcee-ai/Arcee-Blitz
base_model: SCE-Abomination
dtype: bfloat16
```
#### The-True-Abomination-24B

```yaml
models:
  - model: SCE-Abomination
  - model: TroyDoesAI/BlackSheep-24B
  - model: UNC-Reasoning
  - model: INT-Multitasks
merge_method: sce
base_model: SCE-Abomination
parameters:
  select_topk: 0.45
dtype: bfloat16
```
## Documentation

### Model Information
| Property | Details |
|----------|---------|
| Base Models | TheDrummer/Cydonia-24B-v2, ReadyArt/Gaslit-Transgression-24B-v1.0, ReadyArt/Forgotten-Safeword-24B-v4.0, etc. |
| Library Name | transformers |
| Tags | mergekit, merge, sce, della, cabs, not-for-all-audiences, rp, roleplay, role-play |
| Language | en |
| Pipeline Tag | text-generation |