
Qwen3 30B A3B ERP V0.1

Developed by Aratako
A role-play-specialized large language model fine-tuned from Qwen3-30B-A3B-NSFW-JP, supporting long-form Japanese text generation
Downloads: 68
Release Time: 5/7/2025

Model Overview

A 30B-parameter model optimized for role-playing scenarios, supporting a 32,768-token context window and built on a Mixture-of-Experts (MoE) architecture with expert-parallel execution
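As a minimal usage sketch, the model can be loaded through the standard Hugging Face transformers API; note that the repository id `Aratako/Qwen3-30B-A3B-ERP-v0.1` is an assumption based on the model name on this page and may differ from the actual path.

```python
# Minimal loading sketch (assumed repo id; adjust to the actual Hugging Face path).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "Aratako/Qwen3-30B-A3B-ERP-v0.1"  # assumption, not confirmed by this page

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # 30B MoE weights; requires sufficient GPU memory
    device_map="auto",
)

prompt = "こんにちは。自己紹介をしてください。"  # "Hello. Please introduce yourself."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=256)
print(tokenizer.decode(outputs[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True))
```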

Model Features

Long Context Support
Supports a 32,768-token context window, suitable for complex role-playing scenarios
MoE Architecture Optimization
Uses a Mixture-of-Experts architecture with 4 expert groups processed in parallel, improving inference efficiency
Role-Play Specialized Templates
Built-in role-playing dialogue templates that accept structured world settings and character profiles (see the prompt sketch after this list)
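The sketch below shows one way such structured input might be passed as a system prompt and rendered with the tokenizer's chat template; the exact template fields and the repository id are assumptions, so follow the model card's own prompt format where it differs.

```python
# Sketch of a structured role-play prompt (field layout is an assumption,
# not the model's confirmed template).
from transformers import AutoTokenizer

model_id = "Aratako/Qwen3-30B-A3B-ERP-v0.1"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id)

system_prompt = (
    "これからロールプレイを行います。以下の設定に従って応答してください。\n"   # "We will now role-play; follow the settings below."
    "【世界観】中世ファンタジーの王国。魔法が日常的に使われている。\n"          # world setting
    "【キャラクター設定】名前: リナ。見習い魔法使い。明るく好奇心旺盛な口調。"   # character profile
)

messages = [
    {"role": "system", "content": system_prompt},
    {"role": "user", "content": "リナ、今日はどんな魔法を練習するの?"},
]

# apply_chat_template renders the messages into the model's chat format
# ready to be tokenized and passed to generate().
prompt_text = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)
print(prompt_text)
```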

Model Capabilities

Japanese long text generation
Structured role-playing
Multi-turn dialogue maintenance
Stylized response generation

Use Cases

Entertainment Applications
Fantasy World Role-Playing
Interactive character dialogue in a medieval fantasy setting
Generates stylized responses consistent with the character's profile
Virtual Character Chat
Multi-turn natural conversations with customized virtual characters
Maintains a consistent persona throughout the interaction