# Bilingual Instruction Optimization
## Kanana 1.5 2.1B Instruct 2505

Publisher: kakaocorp · License: Apache-2.0 · Downloads: 1,361 · Likes: 13
Tags: Large Language Model · Transformers · Multilingual

Kanana 1.5 is a bilingual (Korean/English) large language model developed by Kakao, with significant improvements in programming, mathematics, and function-calling capabilities. It supports a 32K-token context length and can handle ultra-long inputs of up to 128K tokens via YaRN context extension; a usage sketch follows this entry.
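
The model is distributed for the Transformers library, so a minimal chat-generation sketch is given below. The repository id and the prompt are assumptions inferred from this listing rather than values confirmed against the official model card.

```python
# Minimal chat-generation sketch for Kanana 1.5 2.1B Instruct.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kakaocorp/kanana-1.5-2.1b-instruct-2505"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",   # use the dtype stored in the checkpoint
    device_map="auto",    # place weights on the available GPU(s)/CPU
)

# A bilingual instruction; the chat template ships with the tokenizer.
messages = [{"role": "user", "content": "Introduce yourself in both Korean and English."}]
input_ids = tokenizer.apply_chat_template(
    messages, add_generation_prompt=True, return_tensors="pt"
).to(model.device)

output = model.generate(input_ids, max_new_tokens=256)
print(tokenizer.decode(output[0][input_ids.shape[-1]:], skip_special_tokens=True))
```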
## Kanana 1.5 8B Instruct 2505

Publisher: kakaocorp · License: Apache-2.0 · Downloads: 1,932 · Likes: 26
Tags: Large Language Model · Transformers · Multilingual

Kanana 1.5 8B Instruct is a new release in the Kanana model family, with significant improvements in coding, math, and function-calling capabilities over its predecessor. Like the 2.1B variant, it supports a 32K-token context length that can be extended to 128K tokens; a long-context sketch follows this entry.
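
The 128K extension mentioned above relies on YaRN rope scaling. The sketch below overrides the RoPE configuration at load time; the repository id, scaling factor, and position limits are illustrative assumptions, and the YaRN settings recommended on the official model card should take precedence.

```python
# Sketch: extending the 32K context window to ~128K tokens with YaRN rope scaling.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "kakaocorp/kanana-1.5-8b-instruct-2505"  # assumed repository id

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype="auto",
    device_map="auto",
    # Config overrides (assumed values): YaRN scales the 32K base window by 4x.
    rope_scaling={
        "rope_type": "yarn",
        "factor": 4.0,
        "original_max_position_embeddings": 32768,
    },
    max_position_embeddings=131072,
)

long_document = "..."  # placeholder for an ultra-long input (up to ~128K tokens)
prompt = f"Summarize the following document:\n\n{long_document}"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

output = model.generate(**inputs, max_new_tokens=512)
print(tokenizer.decode(output[0][inputs["input_ids"].shape[-1]:], skip_special_tokens=True))
```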