
llm-jp-13b-instruct-full-jaster-dolly-oasst-v1.0

Developed by llm-jp
A large-scale language model developed by the Japanese LLM-jp project, supporting text generation tasks in Japanese and English
Downloads: 750
Release Date: 10/18/2023

Model Overview

This is a Transformer-based large language model that has been instruction-tuned for text generation in Japanese and English. It was developed by the Japanese LLM-jp project, pre-trained on roughly 300 billion tokens, and then fine-tuned on multiple instruction datasets (jaster, Dolly, and OASST, as reflected in the model name).

Model Features

Multilingual support: optimized specifically for Japanese and English, with strong performance in both languages
Large-scale pre-training: pre-trained on a diverse dataset of roughly 300 billion tokens
Instruction fine-tuning: fine-tuned on multiple high-quality instruction datasets to improve instruction-following ability
Efficient inference: supports half-precision floating-point weights (torch.float16) for faster, lower-memory inference (see the loading sketch after this list)
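As a rough illustration of the half-precision loading mentioned above, the sketch below uses the Hugging Face transformers library. The repository id is inferred from the model name and should be verified against the official model page; it is an assumption, not a confirmed identifier.

```python
# Minimal loading sketch (assumes the transformers, torch, and accelerate packages
# are installed and that the repo id below matches the official release).
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "llm-jp/llm-jp-13b-instruct-full-jaster-dolly-oasst-v1.0"  # assumed repo id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision, as noted under "Efficient inference"
    device_map="auto",          # place weights on available GPU(s) if present
)
```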

Model Capabilities

Japanese text generation
English text generation
Instruction following (see the generation sketch after this list)
Question answering
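The following sketch shows one way to prompt the model for instruction following, reusing the `tokenizer` and `model` objects from the loading example above. The "### 回答：" ("### Answer:") suffix is an assumed prompt template; check the official model card for the exact format used during instruction tuning.

```python
# Minimal generation sketch; `tokenizer` and `model` come from the loading example.
import torch

prompt = "自然言語処理とは何か"  # "What is natural language processing?"
prompt = prompt + "### 回答："   # assumed instruction/answer template

inputs = tokenizer.encode(prompt, add_special_tokens=False, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(
        inputs,
        max_new_tokens=128,
        do_sample=True,
        top_p=0.95,
        temperature=0.7,
    )
print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Sampling parameters such as top_p and temperature are illustrative defaults and can be tuned per application.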

Use Cases

Education: language learning assistance that helps students understand and generate text in Japanese and English
Customer service: automated Q&A systems and customer service chatbots in Japanese and English
Content creation: multilingual content generation that assists creators in producing text in Japanese and English