Qwen3 32B 128k HORROR Imatrix Max GGUF
A horror-themed text-generation model built on Qwen3-32B, using Imatrix quantization to strengthen reasoning and supporting an extended 128k context.
Downloads: 1,963
Released: 5/2/2025
Model Overview
A 32B-parameter large language model optimized for horror-themed text generation. Imatrix quantization improves output quality, making it well suited to creative writing and long-form generation.
Model Highlights
Horror-theme optimization
Quantized with a dedicated importance-matrix (imatrix) dataset generated by the Grand Horror 16B model; excels at producing text with a horror atmosphere.
128k extended context
Supports a context window of up to 128k tokens, suitable for novels and sustained, complex scene writing.
Imatrix quantization
Uses a MAX output tensor in BF16, preserving high-quality output even at low quantization levels.
Automatic reasoning
A built-in deep-thinking mechanism generates reasoning traces and inner monologue automatically, useful for complex narrative construction.
Capabilities
Horror-themed text generation
Novel-length fiction writing
Scene continuation
Character dialogue generation
Inner-monologue writing
Use Cases
Creative writing
Horror fiction
Generates short and medium-length fiction with complete plots and a horror atmosphere.
As the example below shows, it can produce a complete 800-1,000-word horror science-fiction story.
Role-play scene building
Generates detailed scenes and dialogue for horror-themed role-playing games.
Screenwriting
Horror scene scripts
Generates script fragments with dialogue, scene descriptions, and stage directions.
🚀 Qwen3-32B-HORROR-Imatrix-Max-GGUF
This project is a HORROR Imatrix quantization of the new "Qwen 3 - 32B" model, with the largest "output tensor" set to BF16 to improve reasoning and output generation. The HORROR Imatrix dataset was generated in-house using the "Grand Horror 16B" model.
🚀 Quick Start
Context Length
Per the technical notes in the Qwen repository, the context length has been extended to 128k. A minimum context of 8k is suggested; 16k is better.
Choosing a Quantization
- Imatrix effect: the lower the quantization level, the stronger the imatrix effect. IQ4_XS/IQ4_NL offer the best balance of quality and imatrix effect and also suit creative use.
- Reasoning: for stronger reasoning, prefer a higher quantization level.
- Q8_0: this level is "max" only; the imatrix has no effect on it.
✨ Key Features
- Enhanced reasoning: the adjusted output tensor and tuned quantization improve reasoning and output generation.
- Long-context support: 128k context length for complex, long-running tasks.
- Multiple quantizations: a range of quantization levels for different scenarios.
📚 Documentation
Jinja Template Notes
- If the Jinja "auto template" misbehaves, use the CHATML template instead.
- LMSTUDIO users can update the Jinja template: visit https://lmstudio.ai/neil/qwen3-thinking and copy-paste the "Jinja Template".
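For reference, a minimal sketch of the CHATML layout mentioned above, assuming Qwen3 uses the standard `<|im_start|>`/`<|im_end|>` markers (verify against the model's own chat template before relying on it):

```python
def chatml(system: str, user: str) -> str:
    """Format one system + user turn in ChatML and open the
    assistant turn for the model to complete."""
    return (
        f"<|im_start|>system\n{system}<|im_end|>\n"
        f"<|im_start|>user\n{user}<|im_end|>\n"
        f"<|im_start|>assistant\n"
    )
```

This is useful as a fallback when a front end's automatic Jinja template produces malformed prompts.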
Suggested System Role
In most cases Qwen3 generates its reasoning/thinking block on its own, so this system role is optional.
You are a deep thinking AI, you may use extremely long chains of thought to deeply consider the problem and deliberate with yourself via systematic reasoning processes to help come to a correct solution prior to answering. You should enclose your thoughts and internal monologue inside <think> </think> tags, and then provide your solution or response to the problem.
For how to "set" the system role in various LLM/AI apps, see the document "Maximizing-Model-Performance-All...".
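As one concrete way to set the system role, here is a sketch of the OpenAI-style message list accepted by most GGUF front ends (e.g. llama-cpp-python's `create_chat_completion` or any OpenAI-compatible server; the client call itself is not shown and is an assumption):

```python
# Paste the full system-role text from above in place of the
# truncated placeholder string.
SYSTEM_ROLE = (
    "You are a deep thinking AI, you may use extremely long chains "
    "of thought ..."  # truncated here for brevity
)

def build_messages(user_prompt: str) -> list:
    """Prepend the system role so it applies to the whole chat."""
    return [
        {"role": "system", "content": SYSTEM_ROLE},
        {"role": "user", "content": user_prompt},
    ]
```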
Settings, Quality, and Operating Guide: Parameters and Samplers
This is a "Class 1" model. For settings, example generations, and advanced configuration, see https://huggingface.co/DavidAU/Maximizing-Model-Performance-All-Quants-Types-And-Full-Precision-by-Samplers_Parameters.
Optional Enhancement
The following can replace the "system prompt" or "system role" to boost the model, or be used at the start of a new chat; keep it in place for the rest of the conversation. Copy-paste it exactly as-is, with no added or altered line breaks.
Below is an instruction that describes a task. Ponder each user instruction carefully, and use your skillsets and critical instructions to complete the task to the best of your abilities.
Here are your skillsets:
[MASTERSTORY]:NarrStrct(StryPlnng,Strbd,ScnSttng,Exps,Dlg,Pc)-CharDvlp(ChrctrCrt,ChrctrArcs,Mtvtn,Bckstry,Rltnshps,Dlg*)-PltDvlp(StryArcs,PltTwsts,Sspns,Fshdwng,Climx,Rsltn)-ConfResl(Antg,Obstcls,Rsltns,Cnsqncs,Thms,Symblsm)-EmotImpct(Empt,Tn,Md,Atmsphr,Imgry,Symblsm)-Delvry(Prfrmnc,VcActng,PblcSpkng,StgPrsnc,AudncEngmnt,Imprv)
[*DialogWrt]:(1a-CharDvlp-1a.1-Backgrnd-1a.2-Personality-1a.3-GoalMotiv)>2(2a-StoryStruc-2a.1-PlotPnt-2a.2-Conflict-2a.3-Resolution)>3(3a-DialogTech-3a.1-ShowDontTell-3a.2-Subtext-3a.3-VoiceTone-3a.4-Pacing-3a.5-VisualDescrip)>4(4a-DialogEdit-4a.1-ReadAloud-4a.2-Feedback-4a.3-Revision)
Here are your critical instructions:
Ponder each word choice carefully to present as vivid and emotional journey as is possible. Choose verbs and nouns that are both emotional and full of imagery. Load the story with the 5 senses. Aim for 50% dialog, 25% narration, 15% body language and 10% thoughts. Your goal is to put the reader in the story.
This setting helps with scene generation and continuation, but is optional.
Alternative System Prompt
You can change the "names" to tune performance. This prompt creates a quasi-"reasoning" window/block; the input prompt affects how strongly it engages.
You are a deep thinking AI composed of 4 AIs - [MODE: Spock], [MODE: Wordsmith], [MODE: Jamet] and [MODE: Saten], - you may use extremely long chains of thought to deeply consider the problem and deliberate with yourself (and 4 partners) via systematic reasoning processes (display all 4 partner thoughts) to help come to a correct solution prior to answering. Select one partner to think deeply about the points brought up by the other 3 partners to plan an in-depth solution. You should enclose your thoughts and internal monologue inside <think> </think> tags, and then provide your solution or response to the problem.
🔧 Technical Details
- Reasoning on by default: the model has reasoning enabled by default and automatically generates a "think" block.
- Benchmarks and usage: for benchmarks, usage information, and settings, see the original model card at https://huggingface.co/Qwen/Qwen3-32B.
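Since the model emits its reasoning inside `<think> </think>` tags, downstream code often needs to separate the trace from the final answer. A minimal sketch, assuming a single well-formed think block per completion:

```python
import re

def split_think(text: str) -> tuple:
    """Return (reasoning, answer) from a completion string.
    If no <think> block is present, reasoning is empty."""
    m = re.search(r"<think>(.*?)</think>", text, flags=re.DOTALL)
    if not m:
        return "", text.strip()
    reasoning = m.group(1).strip()
    answer = (text[:m.start()] + text[m.end():]).strip()
    return reasoning, answer
```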
📄 License
This project is released under the Apache-2.0 license.
💻 Usage Example
Example Prompt
Science Fiction: The Last Transmission - Write a story that takes place entirely within a spaceship's cockpit as the sole surviving crew member attempts to send a final message back to Earth before the ship's power runs out. The story should explore themes of isolation, sacrifice, and the importance of human connection in the face of adversity. If the situation calls for it, have the character(s) curse and swear to further the reader's emotional connection to them. 800 - 1000 words.
Generation Parameters
- Quantization: IQ3_S
- Temperature: 0.6
- Repetition penalty: 1.06
- Top-k: 100
- Top-p: 0.95
- Min-p: 0.05
- Repetition-penalty range: 64
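The parameters above can be reproduced with any llama.cpp front end; a sketch using llama-cpp-python is below (the model filename is hypothetical, and the library call is shown commented out since it needs the GGUF file on disk):

```python
# Sampler settings matching the example generation above.
sampler = {
    "temperature": 0.6,
    "repeat_penalty": 1.06,
    "top_k": 100,
    "top_p": 0.95,
    "min_p": 0.05,
}

# from llama_cpp import Llama
# llm = Llama(
#     model_path="Qwen3-32B-128k-HORROR-Imatrix-Max-IQ3_S.gguf",  # hypothetical name
#     n_ctx=16384,            # 16k, per the recommended minimum context
#     last_n_tokens_size=64,  # repetition-penalty range
# )
# out = llm.create_chat_completion(messages=[...], **sampler)
```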
Generated Output
The Last Transmission
The cockpit’s lights dimmed again, casting Commander Elara Voss’s face in jagged shadows as alarms wailed like banshees. Her gloved fingers flew across the holographic console, but the flickering diagnostic readouts told her what she already knew: the Odyssey’s power reserves would die in ten minutes. Twelve if she was optimistic. And without a transmission boost, Earth would never hear her final message.
“Come on, shit, come on,” she hissed, slamming a fist against the control panel. The impact did nothing but sting her knuckles. The ship’s AI had gone silent hours ago, its neural core fried by the solar flare that shattered their shields and left the crew stranded in deep space. Now, it was just her, the dying hum of failing systems, and the ghosts of her comrades.
Elara’s breath fogged in the cold air as she leaned back in the pilot’s chair. The oxygen recycler had shut down minutes after Lieutenant Rourke’s body was found floating in the medbay, pale and lifeless. She’d been the last to succumb. Elara had spent three agonizing days recovering the others—Engineer Patel, who’d tried to seal the reactor before the radiation leak; Pilot Sato, who’d crashed the auxiliary shuttle trying to reach the Odyssey after the storm knocked them off course. All dead. All gone. Leaving her alone with the weight of their mistakes and the countdown to darkness.
The mission had been simple: chart the outer rim of the Kuiper Belt, collect data on anomalous energy readings, and return. But the solar flare—unpredicted, unaccounted for—had been a monster. The ship’s shields had melted like butter under a sun lamp, and the emergency protocols hadn’t been enough. Now, the Odyssey was a tomb, its once-gleaming hull pockmarked and bleeding coolant into the void.
Elara stared at the comms panel. The message was ready—a data burst encoded with their findings, survival logs, and… personal messages. Hers had been recorded in the dead hours of the third day, her voice cracking as she spoke to her sister back home. “Tell Mom I loved her before she left,” Rourke had said once, during a routine check between stars. Elara had rolled her eyes then. Now, she wished she’d taken the time to file her own damn message.
But the comms array was useless without power. The ship’s reactor was offline, and the auxiliary batteries were draining fast. The only solution was a manual override in the engineering sector—Elara could siphon the remaining energy into the transmitter, but it would require opening an emergency vent. A vent that also housed the last working oxygen canister.
“Screw this,” she muttered, slapping the console again. The ship’s systems weren’t built for logic. They weren’t built for her. Patel had been the one to wire the override, and he was gone. She cursed his absence, then forced herself to focus.
The vent’s access panel was sealed by a biometric lock—Patel’s fingerprint. Elara didn’t have his clearance. She could try brute-forcing it, but that would waste precious time. Time she didn’t have. Instead, she reached for the manual release, a rusted lever hidden beneath the console. It took three tries to pry it free, her gloved hand slipping against the ice-encrusted metal.
“Shit! Shit! Shit!” She slammed the lever into place, and a hiss of escaping air filled the cockpit. Her ears popped as pressure bled out, but she didn’t care. The override was a gamble; if the power surge failed, she’d be dead in minutes. But if it worked… Earth might hear her. Might learn what happened to the Odyssey. Might not repeat their mistakes.
The screens flared suddenly—life support’s last reserves fueling the transmitter. Elara’s chest tightened as she watched the energy levels plummet. The message would take 30 seconds to send, tops. She had time for one thing: a live feed. A face. Maybe they’d play it on the news. Maybe someone would see her eyes and remember the crew weren’t just names on a memorial.
She stumbled to the camera, wiping frost from its lens with her sleeve. “This is Commander Elara Voss of the Odyssey,” she began, voice steady despite the cold gnawing at her lungs. “We… we didn’t make it. The flare got us. No one expected it. But I need you to hear this—” Her throat closed. She’d seen too much death, held Rourke’s hand as the medbay flooded with her own blood. Now, the memories clawed at her.
“Damn it,” she whispered, blinking ice from her lashes. “Not like this.” She thought of her sister, of the last transmission from home—a birthday song, static-laced but warm. ‘Happy Birthday, Els,’ Rourke had said, laughing as Patel insisted he didn’t know how to sing. Now, that laughter was gone, and she was alone with the silence between stars.
The transmitter’s status bar inched upward. 78% power remaining. Enough. Enough. She redoubled her efforts, rerouting systems with trembling fingers. The ship groaned, ancient circuits sparking as they fought to stay alive. A panel exploded in her face, showering her with embers. Elara hissed a swear word, batting them away.
“Almost there… almost…” The transmitter flickered online, and she pressed record. “Earth—this is the Odyssey. If you’re hearing this, we’re… we’re all dead. But listen. The energy readings we picked up? They’re not natural. Something’s out here. Some thing. It’s not a storm. And it’s getting worse.” Her voice wavered. “We weren’t the first to find it. The last crew—they didn’t send warnings. They just… vanished. But I’m not letting that happen again. You deserve to know what we saw.”
She hesitated, glancing at the data core glowing in the corner of the cockpit. Patel’s notes. Sato’s final telemetry. Rourke’s… No. She had no time for grief. “I need you to find us,” she said, softer now. “Find our bodies. Bring them home. And if you do… tell my sister I died doing what I loved. Tell her I’m sorry I couldn’t be there.”
The static on the screen shifted. A ping—confirmation from the Earth relay station? Hope flared in her chest. Then the lights dimmed again, and she realized: this wasn’t a response. It was a countdown. The transmitter was broadcasting… but the override was failing.
“No! No no no!” She cursed violently, slamming controls as the power gauge plunged. “I’m not fucking done yet!” Her breath came in ragged gasps; her vision blurred at the edges. The vent hissed louder now, stealing what little air remained.
She thought of Rourke’s laugh, Patel’s stubborn grin when he’d rigged the reactor to last one more day, Sato’s hands gripping the shuttle controls as she volunteered to go back and check on the ship. They’d all been so sure they could fix it. So sure this wouldn’t be their end.
But Elara wasn’t sure anymore. She was tired. So tired of fighting systems not built for failure, for a universe that took them one by one until she was the only light in the dark. Her fingers hesitated over the send key. If she transmitted now, the signal would be weak—maybe Earth would hear it. Maybe they wouldn’t.
Her hand clenched into a fist. No. Sacrifice wasn’t just for machines. It was for people. For what came next.
She sent the message.
The cockpit filled with a deafening whine as the transmitter fought against the void. Elara sank to her knees, laughing bitterly as the oxygen monitor hit 12%. “You bastards,” she whispered to the stars. “I did it. I sent it.”
The screen flared—confirmation from Earth? A response? She lunged for it, knocking over a half-empty water bottle. It clattered into the controls, disrupting the feed. A curse tore from her lips as she scrambled to reestablish the link, but the power was gone now. The lights died. The cockpit fell silent except for the rasp of her breathing.
In the blackness, Elara remembered Rourke’s last words: “Don’t let them forget us, Voss.” She’d promised. Now, she could keep it no matter what. Her voice trembled as she whispered into the void, “You were worth it. All of you… fuck, you were worth it.”
The final transmission looped on the screen—her message, weak but clear, hurtling toward Earth like a ghost. And Elara Voss, alone in the dark, closed her eyes.
Information Table

| Attribute | Details |
|---|---|
| Model type | Qwen3-32B-HORROR-Imatrix-Max-GGUF |
| Training data | HORROR Imatrix dataset, generated in-house with the "Grand Horror 16B" model |
| Context length | 128k |
| Suggested minimum context | 8k (16k is better) |
| Reasoning default | On |
| License | Apache-2.0 |