PLaMo 2 8B

Developed by pfnet
PLaMo 2 8B is an 8-billion-parameter hybrid-architecture language model developed by Preferred Elements, supporting English and Japanese text generation.
Downloads: 401
Release Date: 2/7/2025

Model Overview

A large-scale language foundation model pre-trained on English and Japanese datasets, employing a Samba-like hybrid architecture (combining selective state space models with sliding window attention mechanisms), focusing on efficient text generation.
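To make the sliding-window component concrete, below is a minimal PyTorch sketch of causal sliding-window attention, where each token attends only to itself and a fixed number of preceding positions. This is an illustration of the general mechanism, not PLaMo 2's actual implementation, and the window size is an assumed parameter.

```python
import torch
import torch.nn.functional as F

def sliding_window_attention(q, k, v, window: int):
    """Causal attention restricted to the last `window` positions
    (the current token included). Illustrative sketch only."""
    # q, k, v: (batch, heads, seq_len, head_dim)
    T = q.size(-2)
    pos = torch.arange(T)
    # Key position j is visible from query position i iff i - window < j <= i.
    mask = (pos[None, :] <= pos[:, None]) & (pos[None, :] > pos[:, None] - window)
    scores = q @ k.transpose(-2, -1) / q.size(-1) ** 0.5
    scores = scores.masked_fill(~mask, float("-inf"))
    return F.softmax(scores, dim=-1) @ v
```

Because each row of the mask covers at most `window` keys, memory and compute per token stay constant with sequence length, which is the efficiency argument behind pairing it with a state-space model.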

Model Features

Efficient Hybrid Architecture
Integrates Mamba2 selective state-space model layers with sliding-window attention, offering higher computational efficiency than standard full-attention Transformers (see the sketch after this list).
Bilingual Support
Optimized for English and Japanese, with training data comprising 6 trillion tokens (45% English / 30% Japanese).
Business-Friendly License
Organizations with annual revenue below 1 billion JPY can apply for commercial use licenses (registration required).
Enhanced Training Stability
Adds normalization layers and an improved Mamba2 core for better stability during large-scale training.
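The Samba-like interleaving can be sketched as a layer stack that is mostly state-space blocks with an attention block inserted every few layers. The ratio (attn_every) and the block factories below are placeholders assumed for illustration; PLaMo 2's published layer pattern may differ.

```python
import torch.nn as nn

class HybridBlockStack(nn.Module):
    """Samba-like pattern: mostly SSM (Mamba2-style) blocks, with a
    sliding-window attention block interleaved every `attn_every`
    layers. Block internals and ratio are illustrative assumptions."""
    def __init__(self, ssm_block, attn_block, n_layers: int, attn_every: int = 4):
        super().__init__()
        self.layers = nn.ModuleList(
            attn_block() if (i + 1) % attn_every == 0 else ssm_block()
            for i in range(n_layers)
        )

    def forward(self, x):
        for layer in self.layers:
            x = x + layer(x)  # residual connection around each block
        return x

# Usage sketch with placeholder blocks:
# stack = HybridBlockStack(lambda: SSMBlock(d), lambda: SWABlock(d), n_layers=32)
```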

Model Capabilities

English Text Generation
Japanese Text Generation
Code Generation (Limited Support)
Open-domain Q&A
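For reference, here is a minimal text-generation example using Hugging Face Transformers. It assumes the repo id pfnet/plamo-2-8b and that the repository ships custom model and tokenizer code (hence trust_remote_code=True); check the model page for the exact recommended settings.

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "pfnet/plamo-2-8b"  # assumed Hugging Face repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.bfloat16,
    device_map="auto",
    trust_remote_code=True,  # PLaMo uses custom model code
)

prompt = "Explain the difference between state space models and attention in two sentences."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
with torch.no_grad():
    output = model.generate(**inputs, max_new_tokens=128, do_sample=True, temperature=0.7)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```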

Use Cases

Content Creation
Multilingual Content Generation
Automatically generate English/Japanese marketing copy, blog posts, etc.
Enterprise Applications
Internal Knowledge Processing
Document summarization, report generation, and other non-commercial internal organizational uses.
Usage must comply with the license's revenue-restriction clause.