KoBART-base-v2
Developed by hyunwoongko
KoBART-base-v2 is a Korean pre-trained model based on the BART architecture, further trained on chat data to improve its handling of long-sequence semantics.
Downloads 72.70k
Release Time: 3/2/2022
Model Overview
This model is a Korean adaptation of the BART architecture, optimized for Korean text processing and able to handle longer-sequence semantics than the original KoBART.
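The model can be used with the Hugging Face transformers library. The sketch below is a minimal loading example, assuming the checkpoint is published on the Hub under an ID such as hyunwoongko/kobart (the exact repository name is an assumption and should be checked on the model page):

```python
# Minimal loading sketch; the Hub repository ID is an assumption and may differ.
from transformers import PreTrainedTokenizerFast, BartForConditionalGeneration

MODEL_ID = "hyunwoongko/kobart"  # assumed Hub ID for KoBART-base-v2

# KoBART ships a fast tokenizer; BartForConditionalGeneration covers both
# seq2seq generation and encoder-style feature extraction.
tokenizer = PreTrainedTokenizerFast.from_pretrained(MODEL_ID)
model = BartForConditionalGeneration.from_pretrained(MODEL_ID)

inputs = tokenizer("안녕하세요. 한국어 BART 모델입니다.", return_tensors="pt")
outputs = model(
    input_ids=inputs["input_ids"],
    attention_mask=inputs["attention_mask"],
)
print(outputs.logits.shape)  # (batch_size, sequence_length, vocab_size)
```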
Model Features
Long sequence processing optimization
Enhanced ability to handle long-sequence semantics through training with chat data.
Post-processor improvements
The tokenizer adds a bos/eos post-processor and no longer returns token_type_ids, which streamlines preprocessing.
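A minimal sketch of checking this tokenizer behavior, reusing the assumed Hub ID from the loading example above:

```python
# Illustrative check of the tokenizer behavior described above (Hub ID assumed).
from transformers import PreTrainedTokenizerFast

tokenizer = PreTrainedTokenizerFast.from_pretrained("hyunwoongko/kobart")

encoded = tokenizer("한국어 문장입니다.")
print(list(encoded.keys()))  # expected: ['input_ids', 'attention_mask'] -- no token_type_ids
print(tokenizer.convert_ids_to_tokens(encoded["input_ids"]))  # bos/eos tokens added by the post-processor
```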
Model Capabilities
Korean text generation
Korean text understanding
Long sequence semantic processing
Use Cases
Text processing
Sentiment analysis
Used for sentiment analysis tasks on Korean text
Achieved an accuracy of 0.901 on the NSMC dataset
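A rough fine-tuning sketch for this use case, assuming the same Hub ID and using transformers' BartForSequenceClassification head; this is an illustration, not the training recipe behind the reported 0.901 accuracy:

```python
# Hypothetical NSMC-style sentiment fine-tuning sketch.
# The Hub ID and the BartForSequenceClassification head are assumptions,
# not the published training setup.
import torch
from transformers import PreTrainedTokenizerFast, BartForSequenceClassification

tokenizer = PreTrainedTokenizerFast.from_pretrained("hyunwoongko/kobart")
model = BartForSequenceClassification.from_pretrained(
    "hyunwoongko/kobart", num_labels=2  # NSMC is binary: positive / negative
)

batch = tokenizer(
    ["정말 재미있는 영화였다.", "시간이 아까운 영화."],
    padding=True, truncation=True, return_tensors="pt",
)
labels = torch.tensor([1, 0])

outputs = model(**batch, labels=labels)
outputs.loss.backward()  # an optimizer step would follow in a real training loop
print(outputs.logits.argmax(dim=-1))
```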
Chat applications
Suitable for text generation in Korean chatbots
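A minimal generation sketch for chat-style output, again assuming the hyunwoongko/kobart Hub ID; a production chatbot would typically fine-tune the model on dialogue data first:

```python
# Rough chat-style generation sketch (Hub ID assumed; base model is not chat-tuned).
from transformers import PreTrainedTokenizerFast, BartForConditionalGeneration

tokenizer = PreTrainedTokenizerFast.from_pretrained("hyunwoongko/kobart")
model = BartForConditionalGeneration.from_pretrained("hyunwoongko/kobart")

prompt = "요즘 날씨가 너무 덥네요."
inputs = tokenizer(prompt, return_tensors="pt")

generated = model.generate(
    inputs["input_ids"],
    max_length=64,
    num_beams=4,
    early_stopping=True,
)
print(tokenizer.decode(generated[0], skip_special_tokens=True))
```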