
GECKO 7B

Developed by kifai
GECKO is a 7-billion-parameter decoder-only Transformer model trained on Korean, English, and code, released under the Apache 2.0 license.
Released: May 27, 2024

Model Overview

GECKO is a generative language model pretrained on 200 billion tokens drawn from a terabyte-scale corpus of Korean, English, and code, making it suitable for multilingual text generation tasks.
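Since GECKO is a standard decoder-only causal language model, it can in principle be loaded with the Hugging Face transformers library. The sketch below is a minimal, hedged example: the model id `kifai/GECKO-7B` and the generation parameters are assumptions based on the description above, not an official quickstart.

```python
def load_gecko(model_id: str = "kifai/GECKO-7B"):
    """Load GECKO for causal text generation.

    The model id is an assumption; check the model hub for the official
    repository name. transformers is imported lazily so this sketch can
    be read without the library installed.
    """
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tokenizer, model


def generate(tokenizer, model, prompt: str, max_new_tokens: int = 128) -> str:
    """Generate a completion for a Korean, English, or code prompt."""
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Because the model weighs in at 7 billion parameters, loading it typically requires a GPU with enough memory (or quantization), which is why loading is kept behind a function here rather than run at import time.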

Model Features

Multilingual Support
Supports text generation tasks in Korean and English.
Code Understanding
Capable of understanding and generating code snippets.
Long Context Processing
Supports context lengths of up to 8k tokens.
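The 8k-token context limit above means that longer documents must be split into windows before they are fed to the model. A minimal, tokenizer-agnostic sketch (the window and overlap sizes are illustrative assumptions, and list items stand in for tokens) is:

```python
def window_tokens(tokens, max_len=8192, overlap=256):
    """Split a token sequence into windows of at most max_len tokens.

    Consecutive windows share `overlap` tokens so that context is not
    cut abruptly at a window boundary.
    """
    if max_len <= overlap:
        raise ValueError("max_len must exceed overlap")
    windows = []
    start = 0
    while start < len(tokens):
        windows.append(tokens[start:start + max_len])
        if start + max_len >= len(tokens):
            break  # the final window reaches the end of the sequence
        start += max_len - overlap
    return windows
```

In practice the token count would come from the model's own tokenizer rather than a pre-split list, but the windowing arithmetic is the same.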

Model Capabilities

Korean Text Generation
English Text Generation
Code Explanation
Multilingual Translation

Use Cases

Code Assistance
Code Explanation
Given a code snippet (for example, HTML), the model explains what the code does.
It accurately describes the code's functionality and its expected output.
Multilingual Applications
Korean-English Translation
Translate text between Korean and English.
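The two use cases above are driven purely by prompting. The helpers below sketch one way to build such prompts; the wording of the templates is an illustrative assumption, not an official prompt format for GECKO.

```python
def code_explanation_prompt(code: str, language: str = "HTML") -> str:
    """Build an instruction-style prompt asking the model to explain code."""
    return (
        f"Explain what the following {language} code does "
        f"and describe its expected output.\n\n{code}\n"
    )


def translation_prompt(text: str, source: str = "Korean",
                       target: str = "English") -> str:
    """Build a prompt asking the model to translate between Korean and English."""
    return f"Translate the following {source} text into {target}.\n\n{text}\n"
```

The resulting strings would be passed to the model as ordinary generation prompts; swapping `source` and `target` covers both translation directions.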