KoBERT-LM

Developed by monologg
KoBERT-LM is a pre-trained Korean language model based on the BERT architecture, further pre-trained on Korean text.
Downloads 49
Release Date: 3/2/2022

Model Overview

KoBERT-LM is a Korean pre-trained language model based on the BERT architecture, further pre-trained with a masked language modeling objective to improve Korean language understanding.

Model Features

Korean Optimization
Further pre-trained on Korean text, improving Korean language understanding.
Based on BERT Architecture
Built on the BERT architecture, inheriting BERT's strong language representation capabilities.
Masked Language Modeling
Further pre-trained with a masked language modeling objective, making it well suited to masked word prediction.
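The masked word prediction described above can be sketched with the Hugging Face `transformers` fill-mask pipeline. This is a minimal, hypothetical example: the Hub id `monologg/kobert-lm` is inferred from the developer name on this card, the BERT-style `[MASK]` token is an assumption, and KoBERT variants may require monologg's custom SentencePiece tokenizer rather than the default one.

```python
# Hypothetical sketch of masked-word prediction with KoBERT-LM.
# Assumptions (not confirmed by this card): the model is published
# on the Hugging Face Hub as "monologg/kobert-lm", and it uses the
# BERT-style "[MASK]" token.

MASK = "[MASK]"  # assumed mask token


def mask_word(sentence: str, word: str) -> str:
    """Replace the first occurrence of `word` with the mask token."""
    return sentence.replace(word, MASK, 1)


def predict_masked(sentence: str):
    """Run fill-mask inference (downloads model weights on first call)."""
    from transformers import pipeline

    fill = pipeline("fill-mask", model="monologg/kobert-lm")
    return fill(sentence)


if __name__ == "__main__":
    # "Korean is difficult to learn" with "difficult" masked out.
    masked = mask_word("한국어는 배우기 어렵다", "어렵다")
    for candidate in predict_masked(masked):
        print(candidate["token_str"], candidate["score"])
```

The pipeline returns the top candidate fillers for the masked position together with their scores, which is exactly the masked language modeling objective the model was further pre-trained on.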

Model Capabilities

Korean Text Understanding
Masked Word Prediction

Use Cases

Natural Language Processing
Korean Text Classification
Can be used for tasks such as sentiment analysis and topic classification of Korean text.
Korean Question Answering System
Can serve as a backbone for Korean question answering systems that understand and answer Korean questions.
© 2025 AIbase