DistilKoBERT
Developed by monologg
DistilKoBERT is a lightweight version of the Korean BERT model KoBERT, compressed via knowledge distillation; it retains most of the original model's performance while requiring fewer computational resources.
Downloads 17.02k
Release Time: 3/2/2022
Model Overview
DistilKoBERT is a lightweight BERT model optimized specifically for Korean, making it suitable for a variety of Korean natural language processing tasks.
Model Features
Lightweight Design
Compresses the original KoBERT model through knowledge distillation, reducing computational resource requirements
Korean Language Optimization
Specifically optimized for the characteristics of the Korean language
Compatible with Hugging Face Ecosystem
Can be loaded and used via the transformers library, as shown in the loading sketch after this list
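The page itself shows no usage code; a minimal loading sketch in Python might look like the following. The Hub ID `monologg/distilkobert` is assumed from the developer name above, and `trust_remote_code=True` is an assumption for the custom tokenizer code; adjust for your transformers version.

```python
import torch
from transformers import AutoModel, AutoTokenizer

# Assumed Hugging Face Hub ID, based on the developer name above.
MODEL_ID = "monologg/distilkobert"

# trust_remote_code=True is assumed here: the DistilKoBERT tokenizer
# may ship custom code not bundled with transformers itself.
tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
model = AutoModel.from_pretrained(MODEL_ID)

# Encode a short Korean sentence ("Hello.") and extract embeddings.
inputs = tokenizer("안녕하세요.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Contextual token embeddings: (batch, seq_len, hidden_size)
print(outputs.last_hidden_state.shape)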
Model Capabilities
Korean text understanding
Korean text classification
Korean named entity recognition (see the token-classification sketch below)
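Named entity recognition is a token-classification task; a hedged sketch of how the listed NER capability could be wired up is shown below. The BIO label set is hypothetical, and the classification head is randomly initialized, so the model must be fine-tuned on annotated Korean NER data before the predictions mean anything.

```python
import torch
from transformers import AutoModelForTokenClassification, AutoTokenizer

MODEL_ID = "monologg/distilkobert"  # assumed Hub ID, as above

# Hypothetical BIO label set, for illustration only.
labels = ["O", "B-PER", "I-PER", "B-LOC", "I-LOC", "B-ORG", "I-ORG"]

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
# The token-classification head is randomly initialized; fine-tune on
# labeled Korean NER data before relying on its output.
model = AutoModelForTokenClassification.from_pretrained(
    MODEL_ID, num_labels=len(labels)
)

# "Yi Sun-sin was a general of Joseon."
inputs = tokenizer("이순신은 조선의 장군이다.", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits  # (1, seq_len, num_labels)

tokens = tokenizer.convert_ids_to_tokens(inputs["input_ids"][0])
for token, pred in zip(tokens, logits.argmax(dim=-1)[0]):
    print(token, labels[pred.item()])
```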
Use Cases
Natural Language Processing
Korean Text Classification
Assigning category labels to Korean text, such as topics or intents
Korean Sentiment Analysis
Determining the sentiment polarity (e.g., positive or negative) of Korean text such as reviews; see the sketch below
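For the sentiment use case above, a sequence-classification sketch follows. Again, `monologg/distilkobert` is the assumed Hub ID, and the two-label head starts untrained; fine-tune on a Korean sentiment corpus (e.g., movie or product reviews) before reading the probabilities as sentiment.

```python
import torch
from transformers import AutoModelForSequenceClassification, AutoTokenizer

MODEL_ID = "monologg/distilkobert"  # assumed Hub ID, as above

tokenizer = AutoTokenizer.from_pretrained(MODEL_ID, trust_remote_code=True)
# num_labels=2 for binary (negative/positive) sentiment; this head is
# randomly initialized and must be fine-tuned before use.
model = AutoModelForSequenceClassification.from_pretrained(
    MODEL_ID, num_labels=2
)

# "Delivery was really fast, so I was happy!"
inputs = tokenizer("배송이 정말 빨라서 좋았어요!", return_tensors="pt")
with torch.no_grad():
    logits = model(**inputs).logits

# Class probabilities; meaningless until the head is fine-tuned.
print(logits.softmax(dim=-1))
```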