
Watashiha GPT-6B

Developed by Watashiha
A Japanese Ōgiri language model built on the GPT-2 architecture, pre-trained on Japanese corpora and fine-tuned on Ōgiri data.
Downloads: 1,831
Released: 12/28/2023

Model Overview

This is a language model designed specifically for generating Ōgiri (大喜利, a traditional Japanese word game in which players give witty answers to prompts) content. It is built on the GPT-2 architecture, pre-trained on a large Japanese corpus, and fine-tuned on 6.93 million Ōgiri data entries.
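Because the model follows a standard GPT-2 causal-LM layout, it should be loadable with the Hugging Face Transformers library. Below is a minimal sketch; the repo ID watashiha/watashiha-gpt-6b, the slow-tokenizer flag, and the Japanese prompt format are assumptions based on this page, not verified against the official model card.

```python
# Minimal loading sketch, assuming the model is published on the Hugging Face
# Hub as "watashiha/watashiha-gpt-6b" (assumed repo ID) with a standard
# causal-LM interface.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "watashiha/watashiha-gpt-6b"  # assumed Hub repo ID
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=False)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # ~6B parameters; half precision fits one GPU
    device_map="auto",          # requires the `accelerate` package
)

# Ōgiri-style prompt: a topic (お題) followed by an answer marker (回答).
# This exact format is an assumption, not taken from the official card.
prompt = "お題:こんな結婚式は嫌だ\n回答:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32, do_sample=True, top_p=0.9)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```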

Model Features

Specialized Ōgiri Generation
Optimized specifically for Ōgiri content, generating humorous answers that follow the conventions of this traditional Japanese word game.
Large-scale Pre-training
Pre-trained on 47.7 billion tokens of Japanese corpora, including C4, CC-100, OSCAR, and Wikipedia.
Targeted Fine-tuning
Fine-tuned on 6.93 million Ōgiri entries to improve the relevance and humor of the generated content.
AWS Optimization
Trained on AWS trn1 (Trainium) instances and optimized for cloud environments.

Model Capabilities

Japanese Text Generation
Ōgiri Content Creation
Humorous Text Generation

Use Cases

Entertainment Applications
Ōgiri Game
Generates humorous answers that follow the conventions of the Ōgiri game (see the sketch after this list).
In evaluation, approximately 44% of generated answers were rated 3 stars (somewhat funny) or higher.
Content Creation
Supplies creative material for comedy shows or social media posts.
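In an entertainment setting, you typically want several candidate answers per topic so that a human (or a downstream filter) can pick the funniest. Below is a hedged sketch that reuses the model and tokenizer loaded above; the sampling hyperparameters are illustrative defaults, not the developers' settings.

```python
# Sample several candidate answers for one Ōgiri topic (お題).
# Sampling hyperparameters are illustrative, not official settings.
prompt = "お題:こんな学校の先生は嫌だ\n回答:"  # assumed prompt format
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=32,
    do_sample=True,          # stochastic decoding for varied, surprising answers
    temperature=0.9,
    top_p=0.95,
    num_return_sequences=5,  # 5 candidates; a human picks the funniest
    pad_token_id=tokenizer.eos_token_id,
)
prompt_len = inputs["input_ids"].shape[1]
for i, seq in enumerate(outputs):
    # Strip the prompt tokens so only the generated answer is printed.
    answer = tokenizer.decode(seq[prompt_len:], skip_special_tokens=True)
    print(f"{i + 1}. {answer}")
```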