
Stockmark 2 100B Instruct Beta

Developed by stockmark
Stockmark-2-100B is a 100-billion-parameter large language model focused on Japanese capabilities, pre-trained on 1.5 trillion tokens of multilingual data and further trained on Japanese synthetic data for improved instruction following.
Downloads: 1,004
Release Time: 3/5/2025

Model Overview

This is a beta version of the model, optimized for Japanese language processing and instruction fine-tuned for better user interaction.

Model Features

Large-scale parameters
With 100 billion parameters, it possesses powerful language understanding and generation capabilities.
Japanese optimization
Focused specifically on Japanese capabilities, with 30% of the training data in Japanese.
Instruction following
Instruction-following ability strengthened through additional training on Japanese synthetic data.
Multilingual support
Supports Japanese and English, with training data consisting of 60% English and 30% Japanese.

Model Capabilities

Japanese text generation
English text generation
Instruction understanding and execution
Multi-turn dialogue
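As a rough illustration of how these capabilities might be exercised, the sketch below queries the model through the Hugging Face transformers library. The repository id stockmark/Stockmark-2-100B-Instruct-beta, the presence of a chat template, and the example prompt are assumptions made for illustration; serving a 100-billion-parameter model in practice also requires multi-GPU sharding or quantization, which is omitted here.

```python
# Minimal sketch, not an official example: asking the instruct model a Japanese
# question via Hugging Face transformers. The repo id and chat template are assumed.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "stockmark/Stockmark-2-100B-Instruct-beta"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",   # requires accelerate; spreads the 100B weights across GPUs
    torch_dtype="auto",
)

# A single-turn instruction in Japanese ("Please briefly explain generative AI.")
messages = [{"role": "user", "content": "生成AIについて簡単に説明してください。"}]

# Format the conversation with the model's chat template and generate a reply.
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output_ids = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output_ids[0][input_ids.shape[-1]:], skip_special_tokens=True))
```

Multi-turn dialogue follows the same pattern: append each assistant reply and the next user message to the messages list before re-applying the chat template.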

Use Cases

Language processing
Japanese Q&A system
Building intelligent Q&A applications for Japanese users
Multilingual content generation
Generating text content in Japanese and English
Education
Japanese learning assistant
Helping learners practice Japanese conversation and writing