# Korean language processing
## DistilKoBERT
License: Apache-2.0
DistilKoBERT is a lightweight version of the Korean BERT model (KoBERT), compressed through knowledge distillation. It retains most of the original model's performance while requiring fewer computational resources.
Tags: Large Language Model, Transformers, Korean
Author: monologg
Downloads: 17.02k · Likes: 5
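Below is a minimal sketch of encoding a Korean sentence with this model through Hugging Face Transformers. The Hub id `monologg/distilkobert` is assumed from the author listed above, and the `AutoTokenizer` line is an assumption as well: some Transformers versions may instead require the custom `KoBertTokenizer` shipped in the model repository.

```python
# Minimal sketch: encode Korean text with DistilKoBERT (assumptions noted in comments).
import torch
from transformers import AutoModel, AutoTokenizer

model_id = "monologg/distilkobert"  # assumed Hub id, based on the author "monologg"

# Assumption: the repo's tokenizer loads via AutoTokenizer; older setups may need
# the custom KoBertTokenizer provided by the repository instead.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("한국어 문장을 인코딩합니다.", return_tensors="pt")
with torch.no_grad():
    outputs = model(**inputs)

# Use the [CLS]-position hidden state as a simple sentence representation.
sentence_vec = outputs.last_hidden_state[:, 0, :]
print(sentence_vec.shape)  # e.g. torch.Size([1, 768])
```

Because the student network keeps far fewer Transformer layers than the original KoBERT, this kind of encoding runs with a smaller memory footprint and lower latency, which is the practical benefit of the distillation step described above.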