
Llama 3 Youko 8B

Developed by rinna
A Japanese-optimized model based on Meta-Llama-3-8B, continually pretrained on a mixed Japanese and English dataset of 22 billion tokens.
Downloads 1,249
Release Time: 5/1/2024

Model Overview

This model significantly improves performance on Japanese tasks through continued pretraining of Llama 3 8B, and is suited to Japanese text generation and comprehension tasks.
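Since this is a base model without instruction tuning, it is used by plain text completion. Below is a minimal usage sketch with the Hugging Face transformers library; the model id rinna/llama-3-youko-8b, the prompt, and the generation settings are illustrative assumptions, not details taken from this page.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed Hugging Face model id for rinna's release; adjust if the hosted name differs.
model_id = "rinna/llama-3-youko-8b"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,  # half precision to fit on a single GPU
    device_map="auto",
)

# Base (non-instruct) model: it simply continues a plain Japanese prompt.
prompt = "西田幾多郎は、"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.8,
    top_p=0.95,
)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```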

Model Features

Japanese Optimization
Continued pretraining on a 22-billion-token Japanese and English corpus significantly enhances Japanese processing capabilities
Multi-source Data Training
Incorporates various high-quality datasets including CC-100, C4, OSCAR, The Pile, and Wikipedia
Bilingual Support
Supports both Japanese and English processing, suitable for bilingual application scenarios

Model Capabilities

Japanese Text Generation
English Text Generation
Text Comprehension
In-context Learning
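Because it is a base model, the in-context learning capability listed above is exercised by placing worked examples directly in the prompt. The sketch below assumes the same Hugging Face model id as before; the English-to-Japanese translation examples are illustrative and not from this page.

```python
from transformers import pipeline

# Assumed model id; as a base model, the task is steered entirely by the prompt.
generator = pipeline(
    "text-generation",
    model="rinna/llama-3-youko-8b",
    torch_dtype="auto",
    device_map="auto",
)

# Few-shot prompt: two worked examples establish the pattern, and the model
# completes the third line in the same format without any fine-tuning.
prompt = (
    "English: Good morning.\nJapanese: おはようございます。\n"
    "English: Thank you for your help.\nJapanese: 手伝ってくれてありがとうございます。\n"
    "English: See you tomorrow.\nJapanese:"
)
print(generator(prompt, max_new_tokens=32, do_sample=False)[0]["generated_text"])
```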

Use Cases

Content Creation
Japanese Article Generation
Generates a variety of articles that follow natural Japanese expression conventions
Produces fluent and natural Japanese text
Education
Japanese Learning Assistance
Helps Japanese learners by generating example sentences or explaining grammar points