llm-jp-3-1.8b-instruct

Developed by llm-jp
A large language model developed by the National Institute of Informatics (NII) in Japan. It supports Japanese and English and has been instruction-tuned to follow user instructions.
Downloads 2,759
Release date: September 23, 2024

Model Overview

This is a Transformer-based large language model focused on text generation and understanding in Japanese and English. It was pre-trained on multilingual datasets and then fine-tuned on instruction data.

Model Features

Multilingual support
Supports text generation and understanding tasks in Japanese and English.
Instruction fine-tuning
Fine-tuned on various instruction datasets to better understand and execute user instructions.
High-performance tokenizer
Uses a Unigram tokenizer with byte fallback, supporting efficient text processing.
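
The instruction-tuned checkpoint can be driven through the Hugging Face transformers library. A minimal sketch, assuming the repository ID `llm-jp/llm-jp-3-1.8b-instruct` and that the tokenizer ships a chat template; the `build_prompt` helper is a hypothetical plain-text fallback for illustration, not the model's official prompt format:

```python
MODEL_ID = "llm-jp/llm-jp-3-1.8b-instruct"  # assumed Hugging Face repo ID

def build_prompt(messages):
    """Flatten role/content messages into a plain prompt string.

    Hypothetical fallback for illustration only; the released tokenizer's
    own chat template should be preferred when it is available.
    """
    parts = [f"{m['role']}: {m['content']}" for m in messages]
    parts.append("assistant:")
    return "\n".join(parts)

def generate(user_text, max_new_tokens=128):
    # Heavy imports are kept local so the sketch can be read and tested
    # without downloading the model weights.
    import torch
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(
        MODEL_ID, torch_dtype=torch.bfloat16
    )
    messages = [{"role": "user", "content": user_text}]
    if tokenizer.chat_template:
        # Preferred path: format the conversation with the tokenizer's
        # built-in chat template.
        input_ids = tokenizer.apply_chat_template(
            messages, add_generation_prompt=True, return_tensors="pt"
        )
    else:
        # Fallback: naive plain-text prompt.
        input_ids = tokenizer(build_prompt(messages), return_tensors="pt").input_ids
    output = model.generate(
        input_ids, max_new_tokens=max_new_tokens, do_sample=False
    )
    # Decode only the newly generated tokens.
    return tokenizer.decode(
        output[0][input_ids.shape[-1]:], skip_special_tokens=True
    )

if __name__ == "__main__":
    print(generate("自然言語処理とは何ですか？"))
```

Keeping the model loading inside `generate` means the prompt-building logic can be inspected or reused without pulling the checkpoint.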

Model Capabilities

Text generation
Instruction understanding
Multilingual processing

Use Cases

Natural language processing
Japanese Q&A system
Used to build Japanese question-answering systems that answer user questions in natural language.
Performs well on the llm-jp-eval benchmark.
Code generation
Multilingual code generation
Supports code generation in multiple programming languages, such as Python and Java.
Performs well on synthetic instruction datasets.