
Japanese GPT-NeoX Small

Developed by rinna
A small Japanese language model based on the GPT-NeoX architecture, supporting text generation tasks
Downloads 838
Release Time: 8/31/2022

Model Overview

This is a small Japanese language model based on the GPT-NeoX architecture, designed primarily for Japanese text generation tasks. The model uses a 12-layer Transformer with a hidden size of 768.
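
The following Python sketch shows one way to load the model with the Hugging Face transformers library and sample a short continuation. It is a minimal illustration assuming the repository id rinna/japanese-gpt-neox-small; the sampling settings are illustrative, not official recommendations.

# Minimal sketch: load the model and sample a Japanese continuation.
# MODEL_ID and all sampling settings are assumptions, not official usage.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

MODEL_ID = "rinna/japanese-gpt-neox-small"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
model.eval()

prompt = "こんにちは、"  # "Hello," in Japanese
inputs = tokenizer(prompt, return_tensors="pt")

with torch.no_grad():
    output_ids = model.generate(
        **inputs,
        max_new_tokens=50,   # length of the generated continuation
        do_sample=True,      # sample instead of greedy decoding
        top_p=0.95,
        temperature=0.8,
    )

print(tokenizer.decode(output_ids[0], skip_special_tokens=True))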

Model Features

Japanese Optimization
Specially trained and optimized for Japanese text
Prefix Tuning Support
Ships with demonstration prefix-tuning weights that can be used to control the style of generated text (see the sketch after this list)
Efficient Inference
Supports NVIDIA FasterTransformer for efficient inference
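
As a rough illustration of the prefix tuning feature, the sketch below attaches a trainable prefix to the base model using the PEFT library. This is a hypothetical setup: the card's demonstration weights may be loaded through a different mechanism, and the virtual-token count is an arbitrary illustrative choice.

# Hypothetical prefix tuning setup via the PEFT library; the demonstration
# weights shipped with this model may use a different loading mechanism.
from peft import PrefixTuningConfig, TaskType, get_peft_model
from transformers import AutoModelForCausalLM

base_model = AutoModelForCausalLM.from_pretrained("rinna/japanese-gpt-neox-small")

peft_config = PrefixTuningConfig(
    task_type=TaskType.CAUSAL_LM,  # autoregressive text generation
    num_virtual_tokens=10,         # trainable prefix length (illustrative)
)
model = get_peft_model(base_model, peft_config)
model.print_trainable_parameters()  # only the prefix parameters are trainable

# Train with a standard causal-LM loop on style-labeled text, then generate
# as usual; the learned prefix steers the output style (e.g., appending 😃).

Because only the prefix parameters are updated, the base model weights stay frozen, which is what makes small style-control weight files like the demonstration weights practical to distribute.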

Model Capabilities

Japanese Text Generation
Prefix Tuning Controlled Generation
Language Modeling

Use Cases

Content Generation
Automatic Japanese Text Generation
Can be used to generate Japanese articles, comments, and other content
Generates text that conforms to Japanese grammar and expression conventions
Style-Controlled Generation
Text Generation with Emoji
Uses prefix tuning demonstration weights to generate text with specific emojis
Automatically adds a 😃 emoji at the end of generated text