Fireblossom 32K 7B
A 7B-parameter language model based on Mistral 7B v0.1, built by merging multiple fine-tuned models via task arithmetic. It supports a 32K context length and balances creativity with reasoning.
Downloads: 21
Release date: April 14, 2024
Model Overview
This 7B-parameter language model was merged with mergekit from Mistral 7B v0.1. It combines several fine-tuned models with strong narrative, role-playing, and reasoning capabilities, aiming to provide diverse text outputs.
Model Features
Extended Context Length
Supports a 32K context length by raising the RoPE theta (`rope_theta`) parameter
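The model card only says that rope theta was adjusted; the sketch below illustrates the mechanism, not the model's exact settings. In rotary position embeddings, each dimension pair rotates at frequency `theta**(-2i/d)`, so raising theta (Mistral 7B v0.1 defaults to 10,000; the 100,000 value here is an illustrative assumption) lowers the rotation frequencies and stretches the usable context window.

```python
def rope_frequencies(theta: float, head_dim: int) -> list[float]:
    """Per-pair rotary frequencies: theta**(-2i/d) for i in 0..d/2-1."""
    return [theta ** (-2.0 * i / head_dim) for i in range(head_dim // 2)]

base = rope_frequencies(10_000.0, 128)     # Mistral 7B v0.1 default theta
scaled = rope_frequencies(100_000.0, 128)  # raised theta (assumed value)

# Raising theta lowers every non-constant frequency, so the rotary phase
# advances more slowly per token and distant positions stay distinguishable.
assert all(s <= b for s, b in zip(scaled, base))
```

In practice the merged checkpoint ships the adjusted `rope_theta` in its config, so no user-side change is needed when loading it.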
Diverse Outputs
Balances creativity and accuracy to provide more diverse text generation
Multi-Model Advantage Fusion
Integrates several well-regarded fine-tuned models, combining their narrative, role-playing, and reasoning strengths
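The fusion method named above, task arithmetic, merges models by adding weighted "task vectors" (fine-tune minus base) onto the base weights. A minimal sketch with toy scalar parameters (the parameter names, fine-tunes, and weights are illustrative, not the model's actual recipe):

```python
def task_arithmetic_merge(base: dict, finetunes: list[dict],
                          weights: list[float]) -> dict:
    """Task arithmetic: merged = base + sum_i w_i * (finetune_i - base).
    Each dict maps parameter names to toy scalars; a real merge (e.g. via
    mergekit) applies the same formula tensor-by-tensor."""
    merged = {}
    for name, b in base.items():
        merged[name] = b + sum(w * (ft[name] - b)
                               for ft, w in zip(finetunes, weights))
    return merged

base = {"w": 1.0}
ft_a = {"w": 1.5}  # hypothetical narrative fine-tune
ft_b = {"w": 0.5}  # hypothetical reasoning fine-tune
merged = task_arithmetic_merge(base, [ft_a, ft_b], [0.6, 0.4])
# 1.0 + 0.6*(1.5 - 1.0) + 0.4*(0.5 - 1.0) = 1.1
```

Because each fine-tune contributes only its delta from the shared base, the merge can blend specialized behaviors without retraining.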
Model Capabilities
Long Text Generation
Role-Playing Dialogue
Reasoning Task Processing
Instruction Following
Use Cases
Creative Writing
Story Creation
Generates long stories with coherent plots, maintaining a consistent narrative within the 32K context window
Dialogue Systems
Role-Playing Chat
Engages in role-playing dialogue, exhibiting vivid character personalities and voices