
Bert Small Kor V1

Developed by bongsoo
Korean foundational model based on the BERT architecture, trained using Korean text data from the AI Hub web corpus (approximately 52 million texts)
Downloads: 41
Release Date: 12/28/2022

Model Overview

This is a Korean foundational model based on the BERT architecture, primarily used for masked language modeling tasks, supporting both Korean and English.

Model Features

Korean Optimization
Trained using Korean text data from the AI Hub web corpus (approximately 52 million texts), specifically optimized for Korean
BERT Architecture
Based on the BERT-base architecture, with strong language understanding capabilities
Multi-task Training
Jointly trained with both NSP (Next Sentence Prediction) and MLM (Masked Language Modeling) objectives
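The MLM objective above can be sketched as follows. This is a generic illustration of BERT-style masking (the conventional 15% selection with an 80/10/10 mask/random/keep split), not code from this model's actual training pipeline; the tokens and vocabulary are invented for the example.

```python
import random

def mlm_mask(tokens, vocab, mask_prob=0.15, rng=None):
    """Apply BERT-style MLM corruption: select ~15% of positions; of those,
    80% become [MASK], 10% become a random vocabulary token, 10% are kept.
    Returns (corrupted tokens, labels), where labels holds the original
    token at trained positions and None at positions the loss ignores."""
    rng = rng or random.Random(0)
    corrupted, labels = [], []
    for tok in tokens:
        if rng.random() < mask_prob:
            labels.append(tok)  # the model must recover the original token here
            r = rng.random()
            if r < 0.8:
                corrupted.append("[MASK]")
            elif r < 0.9:
                corrupted.append(rng.choice(vocab))
            else:
                corrupted.append(tok)
        else:
            labels.append(None)  # position not used in the MLM loss
            corrupted.append(tok)
    return corrupted, labels

# Toy example (a real run would use the model's subword vocabulary)
vocab = ["the", "capital", "of", "korea", "is", "seoul"]
tokens = "the capital of south korea is seoul".split()
corrupted, labels = mlm_mask(tokens, vocab, rng=random.Random(42))
```

The NSP half of the objective would additionally pair this sequence with a second sentence and a binary label indicating whether it actually follows the first in the corpus.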

Model Capabilities

Masked Language Modeling
Korean Text Understanding
English Text Understanding

Use Cases

Text Completion
Capital Name Prediction
Predict the missing capital name in a sentence
Example input: 'The capital of South Korea is [MASK]', predicted result: 'Seoul'
Historical Figure Identification
Identify missing historical figure information in a sentence
Example input: 'Admiral Yi Sun-sin was the most outstanding general of the [MASK] era'; for this example the model returned '' (no valid prediction)
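The fill-mask behavior in these examples can be sketched with a toy scoring step: at the [MASK] position the model emits one logit per vocabulary word, and the prediction is the softmax argmax. The vocabulary, logits, and function name below are invented for illustration; a real run would load this model through a masked-language-modeling inference library such as Hugging Face Transformers.

```python
import math

def fill_mask(tokens, logits_at_mask, vocab):
    """Replace the single [MASK] token with the highest-probability
    vocabulary word; return (completed tokens, that word's probability)."""
    i = tokens.index("[MASK]")
    # Numerically stable softmax over the vocabulary logits at the masked position
    m = max(logits_at_mask)
    exps = [math.exp(x - m) for x in logits_at_mask]
    total = sum(exps)
    probs = [e / total for e in exps]
    best = max(range(len(vocab)), key=probs.__getitem__)
    out = list(tokens)
    out[i] = vocab[best]
    return out, probs[best]

# Hypothetical logits favoring "Seoul" at the masked position
vocab = ["Seoul", "Busan", "Tokyo"]
tokens = ["The", "capital", "of", "South", "Korea", "is", "[MASK]"]
completed, p = fill_mask(tokens, [4.0, 1.5, 0.5], vocab)
```

When a model returns no valid prediction (as in the historical-figure example above), the top-scoring token is typically a subword fragment or punctuation that the post-processing step filters out.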
© 2025 AIbase