# Korean language processing

## DistilKoBERT
- License: Apache-2.0
- Tags: Large Language Model, Transformers, Korean
- Author: monologg
- Downloads: 17.02k · Likes: 5

DistilKoBERT is a lightweight version of the Korean BERT model (KoBERT), compressed via knowledge distillation. It retains most of the original model's performance while requiring noticeably less compute and memory.
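
As a rough usage sketch, the distilled encoder can be loaded through Hugging Face Transformers like any other BERT-style model. The hub id `monologg/distilkobert` and the `trust_remote_code=True` tokenizer path are assumptions here; depending on the Transformers version, the custom Korean tokenizer may instead need to be loaded from the author's `tokenization_kobert` module.

```python
# Minimal sketch: encoding a Korean sentence with DistilKoBERT.
# Assumptions: hub id "monologg/distilkobert"; AutoTokenizer can resolve the
# model's custom Korean tokenizer (older setups import KoBertTokenizer from
# the author's tokenization_kobert module instead).
import torch
from transformers import AutoModel, AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained("monologg/distilkobert", trust_remote_code=True)
model = AutoModel.from_pretrained("monologg/distilkobert")
model.eval()

# Tokenize a Korean sentence and compute contextual token embeddings.
inputs = tokenizer("한국어 문장을 인코딩하는 예시입니다.", return_tensors="pt")
with torch.no_grad():
    hidden = model(**inputs).last_hidden_state  # (batch, seq_len, hidden_size)

print(hidden.shape)
```

Because the distilled model keeps the same encoder interface as KoBERT, it can serve as a drop-in replacement wherever the full-size model's embeddings were used.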