
SEA LION V1 7B

Developed by AI Singapore
SEA-LION-v1-7B is a 7B-parameter large language model optimized for Southeast Asia, supporting 11 Southeast Asian languages.
Downloads: 451
Release date: October 30, 2023

Model Overview

The SEA-LION series of large language models is designed specifically for Southeast Asian languages and cultural contexts. The models are built on the MPT architecture and use a custom tokenizer to improve multilingual performance.

Model Features

Optimized for Southeast Asian languages
Specifically trained for 11 Southeast Asian languages, including a customized tokenizer and regional context understanding.
Large-scale training data
Trained on 980 billion tokens of multilingual data, including programming languages and academic texts.
High-performance architecture
Based on the MPT architecture: 32 layers, a 4096-dimensional hidden state, a 256K-token vocabulary, and a 2048-token context length.
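As a sanity check, the architecture figures above are consistent with the "7B" label. A rough back-of-the-envelope estimate, assuming tied input/output embeddings and the standard transformer feed-forward expansion factor of 4 (neither is stated on this page):

```python
# Rough parameter-count estimate from the quoted architecture figures.
# Each transformer block contributes ~12 * d_model^2 parameters
# (4*d^2 for attention Q/K/V/output + 8*d^2 for a 4x-expansion MLP);
# biases and layer norms are small enough to ignore.

D_MODEL = 4096      # hidden dimension
N_LAYERS = 32       # transformer blocks
VOCAB = 256_000     # ~256K tokenizer vocabulary

embedding = VOCAB * D_MODEL             # token embedding table (tied in/out)
per_layer = 12 * D_MODEL * D_MODEL      # attention + MLP per block
total = embedding + N_LAYERS * per_layer

print(f"~{total / 1e9:.1f}B parameters")
```

This lands at roughly 7.5 billion parameters, in line with the model's 7B designation; the exact count depends on details (biases, norms, untied embeddings) that the estimate omits.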

Model Capabilities

Multilingual text understanding
Southeast Asian context processing
Programming language understanding
Academic text processing

Use Cases

Multilingual applications
Southeast Asian language translation
Supports mutual translation between Southeast Asian languages.
Regional content generation
Generates content that conforms to the Southeast Asian cultural context.
Technical document processing
Code understanding and generation
Processes programming languages such as Python and JavaScript.