GPT-2 Large Japanese

Developed by ABEJA
A large Japanese GPT-2 model trained by ABEJA, supporting Japanese text generation tasks
Downloads: 960
Release Date: 8/29/2022

Model Overview

This is a large Japanese language model based on the GPT-2 architecture, specifically optimized for Japanese text generation tasks.

Model Features

Japanese-specific Model
Trained and optimized specifically for Japanese text
Diverse Generation
Supports various decoding strategies, such as top-k and top-p (nucleus) sampling, for diverse text generation (see the sketch after this list)
Rich Pretraining Data
Pretrained on multiple high-quality Japanese corpora, including Japanese CC-100, Japanese Wikipedia, and Japanese OSCAR
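
Below is a minimal sketch of diverse sampling with the Hugging Face transformers library. It assumes the model is published on the Hugging Face Hub under the ID "abeja/gpt2-large-japanese" and works with the standard causal-LM API; the prompt and the sampling parameters are illustrative, not prescribed by the model card.

from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "abeja/gpt2-large-japanese"  # assumed Hub ID
# The tokenizer may additionally require the sentencepiece package.
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id)

prompt = "人とAIが協調するためには、"  # "For humans and AI to cooperate, ..."
inputs = tokenizer(prompt, return_tensors="pt")

# Diverse generation: stochastic top-k / top-p (nucleus) sampling
# instead of deterministic greedy decoding.
outputs = model.generate(
    **inputs,
    max_new_tokens=100,
    do_sample=True,          # enable sampling
    top_k=50,                # keep only the 50 most likely next tokens
    top_p=0.95,              # nucleus sampling over the top 95% of mass
    temperature=0.8,         # <1.0 mildly sharpens the distribution
    num_return_sequences=3,  # several diverse continuations
)
for seq in outputs:
    print(tokenizer.decode(seq, skip_special_tokens=True))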

Model Capabilities

Japanese Text Generation
Context Understanding
Diverse Text Sampling

Use Cases

Content Creation
Article Continuation
Continues a complete article from a given opening (see the pipeline sketch after this list)
Generates fluent and coherent Japanese text
AI-assisted Writing
Creative Writing
Assists writers with creative ideation and content generation
Provides diverse writing ideas
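
For article continuation specifically, the high-level transformers text-generation pipeline is a more compact entry point. The Hub ID, the opening sentence, and the generation parameters below are again assumptions for illustration.

from transformers import pipeline

generator = pipeline("text-generation", model="abeja/gpt2-large-japanese")

opening = "近年のAI技術は、"  # illustrative opening: "AI technology in recent years ..."
drafts = generator(
    opening,
    max_new_tokens=120,      # length of the generated continuation
    do_sample=True,          # sample rather than decode greedily
    top_p=0.95,
    num_return_sequences=2,  # offer the writer alternative drafts
)
for draft in drafts:
    print(draft["generated_text"])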