Wanabi 24b V1 GGUF

Developed by kawaimasa
A large language model fine-tuned specifically to support Japanese novel writing
Downloads: 274
Release Date: 5/3/2025

Model Overview

A Japanese LLM built by continued training of Mistral-Small-24B-Base-2501, focused on supporting every stage of novel creation

Model Features

Specialized optimization for novel writing
Fine-tuned specifically for Japanese novel writing scenarios
Continued-training releases
Version v0.3 was trained on 4 times the data volume of v0.1
Quantized model support
Provides multiple GGUF quantization formats (Q6_K, Q5_K_M, etc.), quantized with custom calibration data
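As a rough guide to choosing among the GGUF quantizations, the on-disk size can be estimated from the parameter count and the format's bits per weight. The helper below is an illustrative sketch, not part of the model's tooling, and the bits-per-weight figures are approximate values for llama.cpp's K-quant formats:

```python
# Rough GGUF file-size estimate: parameters * bits-per-weight / 8 bytes.
# The bpw values below are approximate llama.cpp figures (assumption),
# ignoring metadata and per-tensor overhead.
BPW = {"Q6_K": 6.56, "Q5_K_M": 5.69, "Q4_K_M": 4.85}

def estimate_gguf_gib(num_params: float, quant: str) -> float:
    """Approximate on-disk size in GiB for `num_params` weights at `quant`."""
    return num_params * BPW[quant] / 8 / 2**30

# For a 24B-parameter model, Q5_K_M lands around 16 GiB and Q6_K around 18 GiB.
for q in BPW:
    print(q, round(estimate_gguf_gib(24e9, q), 1), "GiB")
```

Actual file sizes in the repository will differ somewhat, since embedding and output layers are often kept at higher precision, but the estimate is close enough to check whether a given quant fits in available RAM or VRAM.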

Model Capabilities

Japanese text generation
Novel plot conception
Main content generation
Contextually coherent continuation

Use Cases

Literary creation
Novel conception
Assists writers in generating ideas and story frameworks
Main text generation
Generates coherent novel paragraphs based on prompts
Content continuation
Generates contextually appropriate follow-up developments based on existing content