Kanana 1.5 2.1B Instruct 2505
Kanana 1.5 is a bilingual (Korean/English) large language model developed by Kakao, with significant improvements in programming, mathematics, and function-calling capabilities. It natively supports a 32K-token context length, extendable to 128K tokens via YaRN.
Downloads: 1,361
Release date: 5/21/2025
Model Overview
Kanana 1.5 is an efficient bilingual large language model focused on enhancing programming, mathematics, and function calling capabilities, suitable for complex scenarios and long-text processing.
Model Features
Enhanced Programming Capabilities
Strong performance on programming benchmarks such as HumanEval and MBPP.
Mathematical Reasoning
High scores in mathematical benchmarks like GSM8K and MATH.
Long-Text Processing
Native support for 32K token context length, extendable to 128K tokens via YaRN.
Bilingual Support
Supports both English and Korean processing.
Optimized Dialogue Capabilities
Post-training optimization results in more natural and precise conversational responses.
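The YaRN extension mentioned above stretches the model's RoPE position encoding so a model trained at 32K tokens can attend over 128K. As a rough illustration, the sketch below applies YaRN-style frequency scaling with a factor of 4 (128K / 32K); the ramp constants `beta_fast` and `beta_slow` are illustrative assumptions, not Kakao's actual configuration.

```python
import math

def rope_inv_freq(dim: int, base: float = 10000.0):
    """Standard RoPE inverse frequencies, one per pair of head dims."""
    return [base ** (-2 * i / dim) for i in range(dim // 2)]

def yarn_inv_freq(dim: int, base: float = 10000.0, factor: float = 4.0,
                  orig_ctx: int = 32768, beta_fast: float = 32.0,
                  beta_slow: float = 1.0):
    """YaRN-style scaling sketch: high-frequency dims are kept as-is,
    low-frequency dims are interpolated by `factor`, with a linear ramp
    in between (illustrative constants, not the model's real config)."""
    out = []
    for f in rope_inv_freq(dim, base):
        wavelength = 2 * math.pi / f
        # How many full rotations this dimension makes over the
        # original 32K training context.
        rotations = orig_ctx / wavelength
        if rotations > beta_fast:
            out.append(f)              # high frequency: leave untouched
        elif rotations < beta_slow:
            out.append(f / factor)     # low frequency: interpolate by 4x
        else:
            t = (rotations - beta_slow) / (beta_fast - beta_slow)
            out.append(f * (t + (1 - t) / factor))  # blend the two regimes
    return out
```

The effect is that fast-rotating dimensions (which encode local order) are preserved, while slow-rotating dimensions are stretched fourfold to cover the 128K window.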
Model Capabilities
Text Generation
Code Generation
Mathematical Problem Solving
Long-Text Comprehension
Bilingual Processing
Instruction Following
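Function calling in the list above means the model emits a structured call that application code must parse and execute. A minimal dispatch sketch follows; the JSON shape, the `get_weather` tool, and its arguments are hypothetical examples, not Kanana's actual output format.

```python
import json

# Hypothetical tool registry -- name and signature are illustrative only.
TOOLS = {
    "get_weather": lambda city: f"Sunny in {city}",
}

def dispatch_tool_call(raw: str) -> str:
    """Parse a JSON tool call like {"name": ..., "arguments": {...}}
    and invoke the matching registered function."""
    call = json.loads(raw)
    fn = TOOLS.get(call["name"])
    if fn is None:
        raise ValueError(f"unknown tool: {call['name']}")
    return fn(**call["arguments"])

result = dispatch_tool_call(
    '{"name": "get_weather", "arguments": {"city": "Seoul"}}'
)
print(result)  # prints "Sunny in Seoul"
```

In practice the raw string would come from the model's response, and the tool's return value would be fed back into the conversation for a final answer.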
Use Cases
Programming Assistance
Code Generation
Generate code based on natural language descriptions.
Scores 68.90 on the HumanEval benchmark.
Code Completion
Auto-complete code based on context.
Scores 65.08 on the MBPP+ benchmark.
Mathematical Applications
Mathematical Problem Solving
Solve complex mathematical problems.
Scores 81.43 on GSM8K (zero-shot).
Dialogue Systems
Intelligent Customer Service
Provide natural and fluent bilingual customer service dialogues.
Scores 7.01 on MT-Bench.
© 2025 AIbase