
MPT-7B-StoryWriter

Developed by MosaicML
A fiction-generation model designed for reading and writing long-form text, supporting context lengths of 65k+ tokens
Downloads: 769
Release Date: 5/4/2023

Model Overview

A long-form text generation model fine-tuned from MPT-7B, focused on novel writing and long-text comprehension. It uses ALiBi position encoding to extrapolate beyond the context length seen during training.
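To illustrate why ALiBi removes the hard context limit, here is a minimal sketch of the general ALiBi scheme (not MosaicML's exact implementation): each attention head adds a fixed linear penalty proportional to the query-key distance, so no learned position-embedding table caps the sequence length.

```python
import torch

def alibi_bias(n_heads: int, seq_len: int) -> torch.Tensor:
    """Build the ALiBi additive attention bias: -slope_h * |i - j| per head.

    Because the bias is a simple linear function of token distance, it can be
    built for any seq_len at inference time; there is no learned positional
    embedding table that fixes a maximum length.
    """
    # Head-specific slopes: a geometric sequence 2^(-8/n), 2^(-16/n), ...
    slopes = torch.tensor([2 ** (-8.0 * (h + 1) / n_heads) for h in range(n_heads)])
    # Relative distance |i - j| between query position i and key position j.
    positions = torch.arange(seq_len)
    distance = (positions[None, :] - positions[:, None]).abs()  # (seq_len, seq_len)
    # Bias of shape (n_heads, seq_len, seq_len), added to the attention scores.
    return -slopes[:, None, None] * distance[None, :, :]

# Example: biases for 4 heads over a 6-token sequence.
print(alibi_bias(n_heads=4, seq_len=6).shape)  # torch.Size([4, 6, 6])
```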

Model Features

Extended context processing
Supports a 65k-token context window, extendable to 84k+ tokens at inference time (see the loading sketch after this list)
ALiBi position encoding
Uses Attention with Linear Biases (ALiBi), so the context window can be extended dynamically at inference time
Efficient training optimization
Integrates FlashAttention, QK LayerNorm, and other optimizations to improve training efficiency
Business-friendly license
The Apache 2.0 license permits commercial use
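A minimal loading sketch for the extended-context setting above, assuming the weights are published on the Hugging Face Hub as mosaicml/mpt-7b-storywriter with MPT's custom config fields (max_seq_len, attn_config); check the actual model card for the exact identifiers and values.

```python
import transformers

model_id = "mosaicml/mpt-7b-storywriter"  # assumed Hugging Face Hub id

# Load the custom MPT config; trust_remote_code is required because MPT ships
# its own modeling code rather than a stock Transformers architecture.
config = transformers.AutoConfig.from_pretrained(model_id, trust_remote_code=True)

# Extend the context window beyond the 65k tokens used in fine-tuning.
# ALiBi makes this a pure config change: no new position embeddings are needed.
config.max_seq_len = 83968

# Optionally switch to a faster attention kernel if one is available.
# config.attn_config["attn_impl"] = "triton"

model = transformers.AutoModelForCausalLM.from_pretrained(
    model_id,
    config=config,
    torch_dtype="auto",
    trust_remote_code=True,
)
# MPT models use the GPT-NeoX-20B tokenizer.
tokenizer = transformers.AutoTokenizer.from_pretrained("EleutherAI/gpt-neox-20b")
```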

Model Capabilities

Long-form text generation
Fiction writing
Story continuation
Extended text comprehension

Use Cases

Creative writing
Automatic novel generation
Generates complete novel content from an opening paragraph
Demonstrated the ability to generate coherent text at context lengths of up to 84k tokens
Story continuation
Generates follow-up plots for classic literary works (e.g., 'The Great Gatsby'); see the generation sketch after this section
The model can maintain the original writing style and produce a plausible ending
Long-text analysis
Extended document processing
Content analysis and summary generation for entire novels
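A hedged sketch of the story-continuation use case, reusing the model and tokenizer objects from the loading sketch above; the prompt and sampling parameters are illustrative only.

```python
import torch

# `model` and `tokenizer` come from the loading sketch in Model Features.
# Continue a story from an opening paragraph.
prompt = (
    "The lighthouse keeper had not spoken to another soul in eleven years, "
    "so when the rowboat appeared at dawn, he assumed he was dreaming."
)

inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=512,   # length of the continuation
        do_sample=True,       # sample for varied, creative continuations
        temperature=0.8,
        top_p=0.95,
    )

# Decode only the newly generated tokens.
continuation = tokenizer.decode(
    output_ids[0, inputs["input_ids"].shape[1]:],
    skip_special_tokens=True,
)
print(continuation)
```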