KoBERT Base V1
KoBERT is a BERT model optimized for Korean, developed by SKTBrain and pre-trained on large Korean corpora.
Downloads: 92.83k
Released: 3/2/2022
Model Overview
KoBERT is a pre-trained Korean language model based on the Transformer architecture. It is tuned to the grammatical and lexical characteristics of Korean and is suitable for a wide range of Korean natural language processing tasks.
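A minimal sketch of running the model for feature extraction with the Hugging Face `transformers` library. The real checkpoint would be loaded from the Hub (e.g. `BertModel.from_pretrained("skt/kobert-base-v1")`, an assumed model id); here a tiny randomly initialized config stands in so the sketch runs offline:

```python
import torch
from transformers import BertConfig, BertModel

# Tiny stand-in for KoBERT; the real model uses hidden_size=768 and 12 layers.
config = BertConfig(
    vocab_size=8002,       # KoBERT uses a ~8k-token SentencePiece vocabulary
    hidden_size=64,        # shrunk for the offline sketch
    num_hidden_layers=2,
    num_attention_heads=2,
    intermediate_size=128,
)
model = BertModel(config)
model.eval()

# Fake batch of token ids; in practice these come from the KoBERT tokenizer.
input_ids = torch.randint(0, config.vocab_size, (1, 16))
attention_mask = torch.ones_like(input_ids)

with torch.no_grad():
    outputs = model(input_ids=input_ids, attention_mask=attention_mask)

print(outputs.last_hidden_state.shape)  # (1, 16, 64): one vector per token
```

The per-token vectors in `last_hidden_state` are the contextual embeddings that downstream Korean NLP tasks fine-tune on.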
Model Features
Korean Optimization
Trained and tuned specifically for the grammatical and lexical characteristics of Korean
Pre-trained Model
Ships pre-trained weights that can be fine-tuned for a variety of downstream tasks
High Efficiency
Outperforms general-purpose multilingual BERT models on Korean NLP tasks
Model Capabilities
Korean text understanding
Korean text classification
Korean named entity recognition
Korean question answering systems
Korean masked-token prediction (fill-in-the-blank)
Use Cases
Text Analysis
Sentiment Analysis
Classify the sentiment polarity of Korean text
Achieves over 90% accuracy in Korean sentiment analysis tasks
Information Extraction
Named Entity Recognition
Identify entities such as person, place, and organization names in Korean text
Achieves an F1 score of 85% in Korean NER tasks
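NER is typically framed as per-token classification over a BIO tag set. A minimal sketch with `BertForTokenClassification`, again using a tiny random config as a stand-in for the pre-trained checkpoint (the tag set below is illustrative, not KoBERT's official one):

```python
import torch
from transformers import BertConfig, BertForTokenClassification

# Illustrative BIO tags for person/location/organization entities.
tags = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG"]

config = BertConfig(vocab_size=8002, hidden_size=64, num_hidden_layers=2,
                    num_attention_heads=2, intermediate_size=128,
                    num_labels=len(tags))
model = BertForTokenClassification(config)
model.eval()

input_ids = torch.randint(0, config.vocab_size, (1, 10))
with torch.no_grad():
    logits = model(input_ids=input_ids).logits  # (1, 10, 7): tag scores per token

# Decode the highest-scoring tag for each token position.
pred = [tags[i] for i in logits.argmax(dim=-1)[0].tolist()]
print(pred)  # one BIO tag per input token
```

Contiguous `B-`/`I-` spans in the predicted sequence are then merged into entity mentions.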