Ko-GPT-Trinity 1.2B v0.5

Developed by skt
A 1.2-billion-parameter Korean Transformer model based on the GPT-3 architecture, developed by SK Telecom and used primarily for Korean text generation and comprehension tasks.
Downloads 1,294
Release Date: 3/2/2022

Model Overview

This model is a Transformer designed on the GPT-3 architecture. It learns internal representations of Korean from which features can be extracted for downstream tasks; its core strength is generating coherent text from a prompt.
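As a sketch of prompt-based generation, the snippet below loads the model through the Hugging Face `transformers` library and completes a Korean prompt. The Hub model ID `skt/ko-gpt-trinity-1.2B-v0.5` and the sampling parameters are assumptions for illustration, not taken from this page; the import is deferred so nothing is downloaded until `generate()` is actually called.

```python
def build_generation_kwargs(max_new_tokens: int = 64,
                            temperature: float = 0.8,
                            top_p: float = 0.95) -> dict:
    """Collect common sampling parameters for GPT-style decoding.

    The default values here are illustrative, not recommendations
    from the model's authors.
    """
    if temperature <= 0.0:
        raise ValueError("temperature must be positive")
    if not 0.0 < top_p <= 1.0:
        raise ValueError("top_p must be in (0, 1]")
    return {
        "max_new_tokens": max_new_tokens,
        "do_sample": True,
        "temperature": temperature,
        "top_p": top_p,
    }


def generate(prompt: str,
             model_id: str = "skt/ko-gpt-trinity-1.2B-v0.5") -> str:
    """Complete a Korean prompt with the model (assumed Hub ID above)."""
    # Lazy import: the ~1.2B-parameter download happens only on first use.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)

    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, **build_generation_kwargs())
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

A call such as `generate("인공지능의 미래는")` ("The future of AI is") would return the prompt extended with sampled Korean text.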

Model Features

Korean optimization
The model is primarily trained on Korean text, making it most suitable for Korean text classification, retrieval, summarization, or generation tasks.
Large-scale pretraining
Trained on Ko-DAT, a large-scale curated dataset that SK Telecom built specifically for this model, totaling 35 billion tokens.
High-performance inference
Outperforms models such as KoElectra-base and KoBERT-base on reasoning benchmarks such as BoolQ, CoPA, and WiC.

Model Capabilities

Korean text generation
Korean text classification
Korean text retrieval
Korean text summarization

Use Cases

Text generation
Content creation
Generates coherent Korean text based on prompts, suitable for content creation and writing assistance.
Text comprehension
Q&A systems
Used to build Korean Q&A systems to answer user questions.
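For the Q&A use case, a decoder-only model like this one is typically prompted with a context/question template and asked to continue with the answer. The template below is a hypothetical illustration (its wording is not from the official model card):

```python
def build_qa_prompt(context: str, question: str) -> str:
    """Format a Korean Q&A prompt for a decoder-only LM.

    The model is expected to continue the text after "답변:" ("Answer:").
    The field labels 지문 ("passage") and 질문 ("question") are an
    assumed template, not one published with the model.
    """
    return f"지문: {context}\n질문: {question}\n답변:"


# Example prompt: "Seoul is the capital of South Korea." /
# "What is the capital of South Korea?"
prompt = build_qa_prompt(
    "서울은 대한민국의 수도이다.",
    "대한민국의 수도는 어디인가?",
)
```

The resulting string would then be passed to the model's generation call, and the completion after the final "답변:" taken as the answer.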