
LLM-jp-3.1-1.8b

Developed by: llm-jp
LLM-jp-3.1-1.8b is a large language model developed by the LLM-jp project at the National Institute of Informatics in Japan. Built on the LLM-jp-3 series, it incorporates instruction pre-training to strengthen instruction-following ability.
Downloads: 572
Released: 5/27/2025

Model Overview

LLM-jp-3.1-1.8b is a large language model based on the Transformer architecture with multilingual support. It is specifically optimized for instruction following in Japanese and English.

Model Features

Instruction pre-training
Instruction data is incorporated partway through pre-training, significantly improving the model's instruction-following ability
Multilingual support
Handles multiple languages, including Japanese, English, Chinese, and Korean
Multiple parameter scales
Available in several parameter sizes to match different compute budgets

Model Capabilities

Japanese text generation
English text generation
Multilingual translation
Instruction understanding and execution
Code generation
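To make the instruction-following usage concrete, here is a minimal sketch of assembling an instruction-style prompt for a chat model. The turn delimiters below are illustrative assumptions, not the model's official format; in practice the chat template bundled with the model's tokenizer (e.g. `tokenizer.apply_chat_template` in Hugging Face Transformers) should be used instead.

```python
# Minimal sketch of building an instruction-style prompt.
# NOTE: the <system>/<user>/<assistant> delimiters are assumptions for
# illustration only; real use should rely on the tokenizer's own chat template.

def build_prompt(instruction: str,
                 system: str = "You are a helpful assistant.") -> str:
    """Concatenate a system message and a user instruction into one prompt,
    leaving an open assistant turn for the model to complete."""
    return (
        f"<system>{system}</system>\n"
        f"<user>{instruction}</user>\n"
        f"<assistant>"
    )

prompt = build_prompt("Translate 'good morning' into Japanese.")
print(prompt)
```

The open `<assistant>` tag at the end is the conventional trick: generation is conditioned on the full conversation so far, and the model's continuation becomes the assistant's reply.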

Use Cases

Natural language processing
Japanese question-answering system
Build an intelligent question-answering application in Japanese
Scores 6.30 on the Japanese MT-Bench evaluation
Multilingual translation
Supports translation between Japanese and languages such as English and Chinese
Code assistance
Code generation
Generate code snippets based on natural language descriptions