KoGPT2
KoGPT2 is a Korean generative pre-trained language model developed and open-sourced by SKT-AI, distributed in a Hugging Face Transformers-compatible format.
Downloads: 1,978
Release Time: 3/2/2022
Model Overview
KoGPT2 is a Korean language model based on the GPT-2 architecture, suitable for various natural language processing tasks such as text generation and dialogue systems.
Model Features
Optimized for Korean
Pre-trained specifically on Korean text, yielding fluent Korean generation.
Transformers compatibility
Compatible with the Hugging Face Transformers framework, making it straightforward to load and integrate (see the loading sketch after this list).
Suitable for multiple scenarios
Supports multiple application scenarios such as dialogue generation and text continuation.
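Below is a minimal loading sketch using the Transformers API. The Hub model ID `skt/kogpt2-base-v2` and the special-token arguments are assumptions taken from SKT-AI's public repository conventions; adjust them if this listing points to a different checkpoint.

```python
# Minimal loading sketch for KoGPT2 with Hugging Face Transformers.
# The hub ID "skt/kogpt2-base-v2" is an assumption based on SKT-AI's repository.
from transformers import GPT2LMHeadModel, PreTrainedTokenizerFast

tokenizer = PreTrainedTokenizerFast.from_pretrained(
    "skt/kogpt2-base-v2",
    bos_token="</s>", eos_token="</s>", unk_token="<unk>",
    pad_token="<pad>", mask_token="<mask>",
)
model = GPT2LMHeadModel.from_pretrained("skt/kogpt2-base-v2")

# Quick sanity check: tokenize a short Korean sentence.
print(tokenizer.tokenize("안녕하세요, 한국어 GPT-2 모델입니다."))
```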
Model Capabilities
Korean text generation
Dialogue systems
Text continuation (a generation sketch follows this list)
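To illustrate text continuation, the sketch below samples a completion for a short Korean prompt with `model.generate`. The prompt and sampling parameters are illustrative, and the Hub ID is the same assumption as in the loading sketch above.

```python
# Text-continuation sketch: sample a continuation for a Korean prompt.
# Hub ID, prompt, and sampling parameters are illustrative assumptions.
import torch
from transformers import GPT2LMHeadModel, PreTrainedTokenizerFast

tokenizer = PreTrainedTokenizerFast.from_pretrained(
    "skt/kogpt2-base-v2",
    bos_token="</s>", eos_token="</s>", unk_token="<unk>",
    pad_token="<pad>", mask_token="<mask>",
)
model = GPT2LMHeadModel.from_pretrained("skt/kogpt2-base-v2")
model.eval()

prompt = "근육이 커지기 위해서는"  # "To build muscle, ..." (illustrative prompt)
input_ids = tokenizer.encode(prompt, return_tensors="pt")

with torch.no_grad():
    output = model.generate(
        input_ids,
        max_length=64,
        do_sample=True,
        top_k=50,
        top_p=0.95,
        repetition_penalty=2.0,
        pad_token_id=tokenizer.pad_token_id,
    )

print(tokenizer.decode(output[0], skip_special_tokens=True))
```

Sampling (`do_sample=True` with `top_k`/`top_p`) is used here rather than greedy decoding because open-ended continuation and chatbot-style replies generally benefit from more varied output.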
Use Cases
Dialogue system
Everyday-conversation chatbot
Can be used to build a Korean chatbot for everyday conversation
See the demonstration example: http://demo.tmkor.com:36200/dialo
Content generation
Cosmetics review generation
Generate review content related to cosmetics
See the demonstration example: http://demo.tmkor.com:36200/ctrl