KoBERT
Developed by monologg
KoBERT is a Korean pre-trained language model based on the BERT architecture, suitable for various Korean natural language processing tasks.
Downloads: 1.2M
Release Date: 3/2/2022
Model Overview
KoBERT is a Korean pre-trained language model based on the BERT architecture, originally developed by SKT Brain; the monologg checkpoint is a port for the Hugging Face transformers library. The model is optimized for Korean and can be used for natural language processing tasks such as text classification and named entity recognition.
Model Features
Korean optimization
Tokenization and pretraining are tailored to Korean, giving better handling of Korean text than general multilingual models.
Based on BERT architecture
Adopts the BERT architecture, providing strong contextual understanding capabilities.
Pre-trained model
Pre-trained on large-scale Korean corpora, ready for direct use in downstream tasks.
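Because the model is pre-trained, it can be loaded and run in a few lines. A minimal sketch, assuming the Hugging Face transformers library and network access to the Hub, using the monologg/kobert checkpoint named above (the KoBERT tokenizer ships custom code on the Hub, so trust_remote_code is required):

```python
from transformers import AutoModel, AutoTokenizer

# The KoBERT tokenizer is distributed as custom code, hence trust_remote_code=True.
tokenizer = AutoTokenizer.from_pretrained("monologg/kobert", trust_remote_code=True)
model = AutoModel.from_pretrained("monologg/kobert")

# Encode a Korean sentence and run the encoder.
inputs = tokenizer("한국어를 이해하는 모델입니다.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```

The contextual token vectors in `last_hidden_state` are what downstream heads (classification, NER) are trained on.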
Model Capabilities
Korean text understanding
Text classification
Named entity recognition
Use Cases
Natural language processing
Korean text classification
Classify Korean text for tasks such as sentiment analysis and topic classification.
Korean named entity recognition
Identify entities such as person names, place names, and organization names in Korean text.
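NER models built on KoBERT typically emit one BIO tag per token (B-PER, I-LOC, O, ...), which must then be grouped into entity spans. A self-contained decoding sketch (the tag set and example tokens are illustrative):

```python
def bio_to_entities(tokens, tags):
    """Group BIO tags (B-PER / I-PER / O, ...) into (entity_type, text) spans."""
    entities, current = [], None
    for token, tag in zip(tokens, tags):
        if tag.startswith("B-"):          # a new entity starts
            if current:
                entities.append(current)
            current = (tag[2:], [token])
        elif tag.startswith("I-") and current and current[0] == tag[2:]:
            current[1].append(token)      # entity continues
        else:                             # O tag or inconsistent I- tag
            if current:
                entities.append(current)
            current = None
    if current:
        entities.append(current)
    return [(etype, " ".join(toks)) for etype, toks in entities]

# "김철수 는 서울 에 산다" — person name followed by a place name.
spans = bio_to_entities(
    ["김철수", "는", "서울", "에", "산다"],
    ["B-PER", "O", "B-LOC", "O", "O"],
)
print(spans)  # [('PER', '김철수'), ('LOC', '서울')]
```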