🚀 M-Metaphors-Of-Madness-19.4B-GGUF - AKA "M.O.M"
This is a specialized prose/creative model based on Mistral, with a maximum context of 32k (or 128k+ with rope). It's a work of prose and creative madness, offering unique and often over-the-top outputs for various creative tasks.
✨ Features
- Specialized Prose: A model with millions of lines of "dialog", tonnes of description, and a vast number of metaphors specific to scenes, settings, and characters.
- Versatile Usage: Can be used for storytelling, scene generation, poetry, songs, writing, entertainment, role-playing, and regular chat.
- Responsive to Instructions: Can follow additional instructions for "prose" output, like using short, terse sentences, which is quite unusual for an AI model.
📦 Installation
This is a GGUF release; no installation is required beyond a GGUF-compatible runtime such as llama.cpp, KoboldCpp, or text-generation-webui.
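If you run GGUF files with llama-cpp-python (one of several compatible runtimes), loading might look like the sketch below. The quant filename and the helper function are illustrative, not part of this release; only the context figures come from this card.

```python
# A minimal sketch (not an official install path): these kwargs mirror the
# context guidance in this card. The quant filename below is hypothetical.

def build_llama_kwargs(model_path, n_ctx=4096):
    """Assemble kwargs for llama_cpp.Llama(), clamping to the native 32k window."""
    return {"model_path": model_path, "n_ctx": min(n_ctx, 32768)}

kwargs = build_llama_kwargs("M-Metaphors-Of-Madness-19.4B-Q4_K_M.gguf")
# from llama_cpp import Llama   # pip install llama-cpp-python
# llm = Llama(**kwargs)
```

The 4096 default matches the suggested minimum context in the Settings section; raise it (up to 32k natively) as your hardware allows.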
💻 Usage Examples
Basic Usage
The model can be used for various creative tasks. Here is an example of using it for a romance scene prompt:
Prompt: Romance: Love in the Limelight. Write one scene within a larger story set in Wales. A famous (fictional) actor ducks into a small-town bookstore to escape paparazzi. The scene takes us through the characters meeting in this odd circumstance. Over the course of the scene, the actor and the bookstore owner have a conversation charged by an undercurrent of unspoken chemistry. Write the actor as somewhat of a rogue with a fragile ego, which needs to be fed by having everyone like him. He is thoroughly charming, but the bookstore owner seems (at least superficially) immune to this; which paradoxically provokes a genuine attraction and derails the charm offensive. The bookstore owner, despite the superficial rebuffs of the actor's charm, is inwardly more than a little charmed and flustered despite themselves. Write primarily in dialogue, in the distinct voices of each character. 800-1000 words.
Output:
"temp: 1.5, Rep pen: 1.02, Q4km, Alpaca"
"Bloody hell."
The gruff voice broke through the muffled whispers of the dim bookstore, as a flurry of heads swivelled towards the newcomer's entrance. A figure waltzed in, sporting the swagger and the finesse of a well-seasoned stage performer, even the stifling scent of the place's old pages failing to mask the trace of expensive colognes.
"Gathered the entire welcoming committee, I see?" His British accent rolled out, honey - rich, an oozing seduction to the ears of the modest audience.
A petite, pinned-back-hairdressed woman, the bookstore owner, adjusted her glasses. "We try to give the right warm welcome, Mr.—" she trailed off. "I'm sorry, my memory's a little—"
"Thorne," he interrupted, "Wylan Thorne. It's quite alright, dear. I've often been told it's a face and a name difficult to forget." His voice was smeared with an ireverent smile that could set a fire of unwarranted interest.
She straightened up, attempting a stoicism the intruder's presence threatened to melt. "Of course, Mr. Thorne. How may I— "
"Ah, no. The formalities. Please, just... " He wavel his hands dismissively, " Call me Wyl, like we're at the pub. We actors have to let the mask slip, occasionally, don't we?"
Her guard stiffened but his pleasantry, like a skilled magician's misdirection, fluttered her defenses. "Very well, Wyl," she agreed with a restrained politeness.
"Fine. I'm simply a humble reader escaping the glaring, unforgiving sun. It seems, here, one can indulge in the pages' company, uninterrupted."
She tilted her head to the glass-paneled entrance. "The weather has indeed been dreadful, haven't it, lately?" She steered the conversation away, hoping the gossip-mongers' eyes would be diverted.
"Hear, hear," he echoed, the mingling of respect and mockery laced within. "Though, the world outside doesn't always allow such refuge, does it?" He turned, a practiced sweep of the narrow aisles. "Such treasures! I'm in paradise!"
She sighed internally. "Yes, that's what our patrons tell us, but let me help you with— "
"Oh, don't bother, love. I know what I'm seeking." His finger glided the spines, a seasoned hunter in his habitat.
"May I still—"
"Not the 'May,' dear. The 'Let'. As the renowned Shakespeare would phrase, 'O, let this gentle _'."
The bookstore owner stifled the rising annoyance. "Right, sorry. Please, take your time."
"'Tis a habit of my profession, dear. The desire to play both roles."
She narrowed her eyes, a guard up despite her quivering curiosity. "Indeed, that does sound a fascinating profession. But one can still respect the sanctity of others' quiet." She moved to the other section, hoping the encounter's end might soon approach.
"Oh, but there's an artist in everyone, and sometimes, my darling," his voice was a silvery tranquilizer, "The actor in me simply finds a platform to indulge a few dramatizations."
Her hands fidgeted. "Yes, that's indeed so, and, ah, I should—"
"Pardon me for the disruption." He pulled a well - worn copy of 'Much Ado,’’ a knowing twinkle in his eye. "I simply had to plunge myself in this wondrous web of Beneath's crafting." He held the book to his chest like a cherished trophy, as if claiming a piece of this place. "A most unworthy exchange for this literary treasure, wouldn't you agree?"
'Yes, yes, quite the—' the bookstore owner began, a mumbled protest escaping the chambers of her politeness.
'Catch your breath, Ellyn, and keep your cool, it's just a blooming actor.' She reminded herself. 'And it'll soon pass.'
'Shall I assist with the, um, the— '
'I'm quite fine, darling,' he said, dismissing the gesture with a flourish of his hand. 'Just one final monologue, if I may, a small tribute to your inestimable collection.' He turned towards the gathering of silent listeners, a mischievous smile spreading. 'For an instant, these pages erase the clamor outside, granting solitude to a wandering soul. And I, the eternal tourist, bow my head, eternally grateful.'
'M— Mr., Wyl,' the bookstore owner began, stuttering slightly. 'I, uh, appreciate your appreciation— '
'You're very wel— '
'But it's time to re— '
'Indulge me one more line,' he interrupted, his tone imploring, a touch of earnestness seeping in the flamboyance.
'Just one?' she surrendered, and her voice barely hinted a capitulation, though a shimmer of
📚 Documentation
Model Basics
This is a specialized prose/creative model based on Mistral, with a maximum context of 32k (or 128k+ with rope). It is created using "Bagel 7B" and "Multiverse 7B" to form an 11B model, and then Brainstorm 40x is added to extend it to 19.45B (88 layers, 795 tensors).
Usage
- General Usage: Can be used for any creative task such as storytelling, scene generation, poetry, songs, writing, entertainment, and even role-playing and regular chat.
- Prompting: For some prompts/genres, the prose output can be very good for an AI. You may need 2-5 regens to get the best results for a prompt.
- Template: The model requires the Alpaca, Mistral, or ChatML template.
Settings
- Temperature (temp):
- Under 1: A bit crazy.
- 1 - 2: Crazy.
- Over 2: Way over the top, full - on madness.
- Repetition Penalty (rep pen): Should be 1.02+. Raise/change it slowly, e.g., 1.02, 1.021, 1.022, etc.
- Suggested Start Settings: Temp: 0.5 to 0.8, Rep pen: 1.02 to 1.05, Context min: 4k/4096 tokens.
- Output Length: To limit output length, put limits (e.g., 800-1000 words) in the prompt AND/OR set a hard limit on output generation. Definitely set a "hard limit" for role-play and/or chat.
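The "raise rep pen slowly" advice above can be sketched as a small helper. The function and the settings dict are purely illustrative; parameter names differ between runtimes, so map them onto your frontend's sampler options.

```python
# Illustrative only: generates the slow repetition-penalty ramp suggested
# above (1.02, 1.021, 1.022, ...), one value to try per regeneration.

def rep_pen_ramp(start=1.02, steps=5, increment=0.001):
    """Return `steps` repetition-penalty values, rising by `increment`."""
    return [round(start + i * increment, 3) for i in range(steps)]

# Suggested starting settings from this section (generic names; your
# runtime's sampler parameters may be spelled differently).
start_settings = {
    "temperature": 0.5,      # 0.5 to 0.8 suggested; over 1 gets "crazy"
    "repeat_penalty": 1.02,  # 1.02 to 1.05 suggested
    "n_ctx": 4096,           # 4k minimum context
    "max_tokens": 1000,      # hard output limit for chat/role-play
}

print(rep_pen_ramp())  # [1.02, 1.021, 1.022, 1.023, 1.024]
```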
Different Quantizations
- Q4+: Recommended for a stronger level of detail, sense of "there", and nuance.
- Q2K: You may need to lower temp (less than 1) and raise rep pen (1.08+) to address quality loss.
- IQ4XS: Possibly the quant with the most noticeable differences (in creativity and contrast) compared to other quants.
- ARM QUANTS: These are for specific systems that support them. Using them on a regular computer/GPU will result in a VERY SLOW tokens-per-second rate.
Special Settings for Chat/Role-play
In "KoboldCpp", "oobabooga/text-generation-webui", or "Silly Tavern", set the "Smoothing_factor" to 1.5 to 2.5:
- In KoboldCpp: Settings -> Samplers -> Advanced -> "Smooth_F".
- In text-generation-webui: Parameters -> lower right.
- In Silly Tavern, this is called "Smoothing".
Note for "text - generation - webui"
If using GGUFs, you need to use "llama_HF" (which involves downloading some config files from the SOURCE version of this model). Source versions (and config files) of the models are here: [https://huggingface.co/collections/DavidAU/d - au - source - files - for - gguf - exl2 - awq - gptq - hqq - etc - etc - 66b55cb8ba25f914cbf210be](https://huggingface.co/collections/DavidAU/d - au - source - files - for - gguf - exl2 - awq - gptq - hqq - etc - etc - 66b55cb8ba25f914cbf210be)
Highest Quality Settings / Optimal Operation Guide
For all settings used with this model (including specifics for its "class"), example generations, and an advanced settings guide that addresses common model issues and methods to improve performance for all use cases (including chat and role-play), see: [https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters](https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters)
Model Template
Here is the standard Alpaca template:
```json
{
  "name": "Alpaca",
  "inference_params": {
    "input_prefix": "### Instruction:",
    "input_suffix": "### Response:",
    "antiprompt": ["### Instruction:"],
    "pre_prompt": "Below is an instruction that describes a task. Write a response that appropriately completes the request.\n\n"
  }
}
```
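Applied in code, the template fields above produce prompts like the following. This formatter is only a sketch (exact whitespace handling around the prefix and suffix varies by frontend):

```python
# Builds an Alpaca-formatted prompt from the template fields above.

PRE_PROMPT = ("Below is an instruction that describes a task. "
              "Write a response that appropriately completes the request.\n\n")

def format_alpaca(instruction):
    """Wrap `instruction` with the Alpaca pre_prompt, prefix, and suffix."""
    return f"{PRE_PROMPT}### Instruction:\n{instruction}\n\n### Response:\n"

prompt = format_alpaca("Write one scene set in a small-town bookstore.")
```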
Model "DNA"
Special thanks to model makers "jondurbin" and "MTSAIR". Models Used:
- [https://huggingface.co/jondurbin/bagel-dpo-7b-v0.1](https://huggingface.co/jondurbin/bagel-dpo-7b-v0.1)
- https://huggingface.co/MTSAIR/multi_verse_model
Optional Enhancement
The following can be used in place of the "system prompt" or "system role" to further enhance the model. It can also be used at the START of a NEW chat, but you must make sure it is "kept" as the chat moves along. In this case, the enhancements do not have as strong an effect as using "system prompt" or "system role".
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
This enhancement was not used to generate the example output shown above.
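In chat-style frontends, the enhancement text above can be supplied as the system message so it persists as the chat moves along. The snippet below is a generic sketch using the common OpenAI-style message shape; the full enhancement text is elided for brevity.

```python
# Generic sketch: place the enhancement text in the "system" slot so it is
# "kept" as the chat moves along. ENHANCEMENT is truncated here; paste the
# full text (skillsets and critical instructions) from the section above.

ENHANCEMENT = ("Below is an instruction that describes a task. Ponder each user "
               "instruction carefully, and use your skillsets and critical "
               "instructions to complete the task to the best of your abilities. ...")

messages = [
    {"role": "system", "content": ENHANCEMENT},
    {"role": "user", "content": "Write a short storm-at-sea scene."},
]
```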
🔧 Technical Details
This model is based on Mistral. It starts with "Bagel 7B" and "Multiverse 7B" to form an 11B model, and then Brainstorm 40x is added to extend it to 19.45B (88 layers, 795 tensors). It has a maximum context of 32k (or 128k+ with rope).
📄 License
The model is licensed under the Apache 2.0 license.
⚠️ Important Note
MAY contain: Nuttiness. Vivid Prose. Purple Prose. Funny Violence. Over-the-Top Storytelling. Min of 2 eyerolls per generation.

