
KM-BERT

Developed by madatnlp
A BERT model for Korean medical NLP, based on the KR-BERT architecture and pre-trained on a 116-million-word Korean medical corpus
Downloads 241
Release Time: 7/4/2023

Model Overview

A BERT model specifically designed for the Korean medical domain, optimized for understanding medical terminology and for semantic analysis
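As a rough sketch of how such a checkpoint might be loaded with the Hugging Face Transformers library. The repository id madatnlp/km_bert and the example sentence are assumptions for illustration, not taken from the model card:

```python
# Minimal sketch: loading the model with Hugging Face Transformers.
# The repository id "madatnlp/km_bert" is an assumption; check the actual hub page.
from transformers import AutoTokenizer, AutoModel

model_name = "madatnlp/km_bert"  # hypothetical hub id
tokenizer = AutoTokenizer.from_pretrained(model_name)
model = AutoModel.from_pretrained(model_name)

# Encode a Korean medical sentence and obtain contextual embeddings.
# "환자는 고혈압과 당뇨병 병력이 있다." = "The patient has a history of hypertension and diabetes."
inputs = tokenizer("환자는 고혈압과 당뇨병 병력이 있다.", return_tensors="pt")
outputs = model(**inputs)
print(outputs.last_hidden_state.shape)  # (batch, seq_len, hidden_size)
```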

Model Features

Medical Domain Optimization
Pre-trained using three types of professional corpora: medical textbooks, health information, and research papers
Korean Language Adaptation
Architecture customized for Korean-specific word order characteristics
Multi-task Evaluation
Language understanding validated through MLM/NSP tasks and the MedSTS dataset
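The MLM objective used in pre-training can be probed with a fill-mask pipeline. The sketch below assumes the same hypothetical repository id and an illustrative Korean prompt; it is not the authors' evaluation setup:

```python
# Minimal sketch: probing the masked-language-modeling objective with a fill-mask pipeline.
# Repository id and example prompt are assumptions; adjust to the actual checkpoint.
from transformers import pipeline

fill = pipeline("fill-mask", model="madatnlp/km_bert")  # hypothetical hub id
# "환자는 [MASK] 진단을 받았다." = "The patient was diagnosed with [MASK]."
masked = f"환자는 {fill.tokenizer.mask_token} 진단을 받았다."
for pred in fill(masked, top_k=3):
    print(pred["token_str"], round(pred["score"], 3))
```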

Model Capabilities

Medical text understanding
Semantic similarity calculation
Named entity recognition
Korean medical terminology processing
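For the semantic similarity capability listed above, one common approach is to compare mean-pooled encoder states with cosine similarity. The pooling strategy, repository id, and example sentences below are illustrative assumptions, not the setup reported for MedSTS:

```python
# Minimal sketch: sentence similarity via mean-pooled hidden states + cosine similarity.
# Pooling strategy and repository id are assumptions, not the authors' method.
import torch
from transformers import AutoTokenizer, AutoModel

name = "madatnlp/km_bert"  # hypothetical hub id
tok = AutoTokenizer.from_pretrained(name)
enc = AutoModel.from_pretrained(name)

def embed(text: str) -> torch.Tensor:
    inputs = tok(text, return_tensors="pt")
    with torch.no_grad():
        hidden = enc(**inputs).last_hidden_state   # (1, seq_len, hidden)
    mask = inputs["attention_mask"].unsqueeze(-1)  # ignore padding positions
    return (hidden * mask).sum(1) / mask.sum(1)    # mean pooling over tokens

a = embed("환자가 복통을 호소한다.")        # "The patient complains of abdominal pain."
b = embed("환자가 배가 아프다고 말한다.")    # "The patient says their stomach hurts."
print(torch.cosine_similarity(a, b).item())
```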

Use Cases

Medical Information Processing
Medical Literature Analysis
Extracting key information from Korean medical research papers
Outperforms baseline models in NER tasks
Health Consultation Semantic Matching
Matching patient inquiries with medical knowledge base content
Strong performance on the MedSTS dataset
Clinical Support Systems
Electronic Medical Record Processing
Extracting structured information from Korean electronic medical records
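For extracting entities from electronic medical record text, the encoder could serve as a backbone for token classification. The sketch below uses a hypothetical label set, a hypothetical repository id, and an illustrative sentence; the classification head is freshly initialized and would need fine-tuning on labeled NER data before its predictions are meaningful:

```python
# Minimal sketch: the encoder as a backbone for token classification (e.g. entity
# extraction from EMR text). Label set, repository id, and example are assumptions.
import torch
from transformers import AutoTokenizer, AutoModelForTokenClassification

labels = ["O", "B-DISEASE", "I-DISEASE", "B-DRUG", "I-DRUG"]  # hypothetical tag set
name = "madatnlp/km_bert"  # hypothetical hub id
tok = AutoTokenizer.from_pretrained(name)
model = AutoModelForTokenClassification.from_pretrained(name, num_labels=len(labels))

# "환자에게 아스피린을 처방하였다." = "Aspirin was prescribed to the patient."
inputs = tok("환자에게 아스피린을 처방하였다.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits               # (1, seq_len, num_labels)
pred_ids = logits.argmax(-1)[0].tolist()
tokens = tok.convert_ids_to_tokens(inputs["input_ids"][0].tolist())
for t, i in zip(tokens, pred_ids):
    print(t, labels[i])                           # untrained head: predictions are random
```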