
BERT Base

Developed by klue
A Korean-pretrained BERT model developed by the KLUE benchmark team, supporting various Korean understanding tasks
Downloads 129.68k
Release Date: 3/2/2022

Model Overview

This model is a Korean language model pretrained on the Transformer architecture, designed for Korean natural language processing tasks such as topic classification, semantic textual similarity, and named entity recognition.
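As a minimal sketch of how such a model is typically loaded, assuming it is published on the Hugging Face Hub under the `klue/bert-base` identifier and that the `transformers` library is installed:

```python
from transformers import AutoModel, AutoTokenizer

# Load the pretrained Korean BERT tokenizer and encoder
# (assumes the Hub id "klue/bert-base"; downloads weights on first use).
tokenizer = AutoTokenizer.from_pretrained("klue/bert-base")
model = AutoModel.from_pretrained("klue/bert-base")

# Encode a Korean sentence and run it through the encoder.
inputs = tokenizer("한국어 문장을 인코딩합니다.", return_tensors="pt")
outputs = model(**inputs)

# BERT-base produces a 768-dimensional hidden state per token.
print(outputs.last_hidden_state.shape)
```

The `last_hidden_state` tensor can then feed a task-specific head (classification, tagging, and so on) for the downstream tasks listed below.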

Model Features

Korean Optimization
Specially optimized for Korean language characteristics, using morpheme-based subword tokenization
Multi-source Data Training
Integrates five public Korean corpora, covering diverse topics and writing styles
Comprehensive Evaluation
Systematically evaluated on multiple tasks of the KLUE benchmark
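To illustrate the subword tokenization idea mentioned above, here is a simplified greedy longest-match-first segmenter in the style of WordPiece. The tiny vocabulary is hypothetical and exists only for illustration; the real model ships its own learned vocabulary.

```python
def wordpiece_tokenize(word, vocab):
    """Greedy longest-match-first subword segmentation (WordPiece-style).

    Non-initial pieces carry a "##" continuation prefix; a word that
    cannot be segmented maps to the unknown token.
    """
    tokens, start = [], 0
    while start < len(word):
        end, piece = len(word), None
        while start < end:
            candidate = word[start:end]
            if start > 0:
                candidate = "##" + candidate
            if candidate in vocab:
                piece = candidate
                break
            end -= 1
        if piece is None:
            return ["[UNK]"]
        tokens.append(piece)
        start = end
    return tokens

# Hypothetical toy vocabulary with morpheme-like pieces.
vocab = {"먹", "##었", "##다", "사과", "##를"}
print(wordpiece_tokenize("사과를", vocab))  # ['사과', '##를']
print(wordpiece_tokenize("먹었다", vocab))  # ['먹', '##었', '##다']
```

Splitting at morpheme-like boundaries keeps the vocabulary compact while preserving the grammatical particles (such as 를) that carry meaning in Korean.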

Model Capabilities

Topic Classification
Semantic Text Similarity Calculation
Natural Language Inference
Named Entity Recognition
Relation Extraction
Dependency Parsing
Machine Reading Comprehension
Dialogue State Tracking

Use Cases

Text Analysis
News Classification
Automatic classification of Korean news articles
Semantic Search
Improving relevance in Korean search engines
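The semantic search use case typically ranks documents by the cosine similarity between sentence embeddings (for example, mean-pooled hidden states from the encoder). A dependency-free sketch with hypothetical 3-dimensional vectors standing in for real embeddings:

```python
import math

def cosine(a, b):
    """Cosine similarity between two equal-length vectors."""
    dot = sum(x * y for x, y in zip(a, b))
    norm_a = math.sqrt(sum(x * x for x in a))
    norm_b = math.sqrt(sum(x * x for x in b))
    return dot / (norm_a * norm_b)

# Hypothetical embeddings; in practice these would come from the model.
query = [0.2, 0.9, 0.1]
docs = {
    "doc1": [0.1, 0.8, 0.2],   # semantically close to the query
    "doc2": [0.9, 0.1, 0.0],   # semantically distant
}

# Rank documents by similarity to the query, most relevant first.
ranked = sorted(docs, key=lambda d: cosine(query, docs[d]), reverse=True)
print(ranked)  # ['doc1', 'doc2']
```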
Information Extraction
Entity Recognition
Extracting entities such as person names and locations from Korean text
Entity F1 score: 83.97
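Entity-level F1 scores like the one above are conventionally computed over exact matches: a predicted entity counts as correct only if both its span and its type agree with the gold annotation. A minimal sketch of that metric, with hypothetical `(start, end, type)` tuples:

```python
def entity_f1(predicted, gold):
    """Entity-level F1: exact match on (start, end, type) tuples."""
    predicted, gold = set(predicted), set(gold)
    true_positives = len(predicted & gold)
    precision = true_positives / len(predicted) if predicted else 0.0
    recall = true_positives / len(gold) if gold else 0.0
    if precision + recall == 0:
        return 0.0
    return 2 * precision * recall / (precision + recall)

# Hypothetical annotations: one span matches, one has the wrong type.
gold = {(0, 3, "PER"), (10, 12, "LOC")}
pred = {(0, 3, "PER"), (10, 12, "ORG")}
print(entity_f1(pred, gold))  # 0.5
```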
© 2025 AIbase