AIbase

# Text inference

## DistilBart-MNLI 12-9
DistilBart-MNLI (12-9) is a lightweight model distilled from bart-large-mnli using a teacher-free distillation technique; it retains most of the original model's accuracy while reducing size and complexity.
Tags: Text Classification · Publisher: valhalla · Downloads: 8,343 · Likes: 12
## Ko-GPT-Trinity 1.2B v0.5
A 1.2-billion-parameter Korean Transformer model based on the GPT-3 architecture, developed by SK Telecom and used primarily for Korean text generation and comprehension tasks.
Tags: Large Language Model · Transformers · Korean · Publisher: skt · Downloads: 1,294 · Likes: 44
## DistilBart-MNLI 12-1
DistilBart-MNLI (12-1) is a more compact model obtained from bart-large-mnli with the same teacher-free distillation technique, retaining high accuracy at a fraction of the size.
Tags: Text Classification · Publisher: valhalla · Downloads: 217.48k · Likes: 52
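The DistilBart-MNLI checkpoints listed above are MNLI-tuned models commonly used for zero-shot text classification through the Hugging Face `transformers` pipeline. The snippet below is a minimal sketch, assuming `transformers` and a PyTorch backend are installed and the `valhalla/distilbart-mnli-12-1` checkpoint can be downloaded; the example text and labels are illustrative only.

```python
from transformers import pipeline

# Zero-shot classification: the NLI model scores each candidate label
# by treating it as an entailment hypothesis against the input text.
classifier = pipeline(
    "zero-shot-classification",
    model="valhalla/distilbart-mnli-12-1",  # compact checkpoint from the listing above
)

text = "The central bank raised interest rates by 50 basis points."
candidate_labels = ["economics", "sports", "technology"]

result = classifier(text, candidate_labels)
print(result["labels"][0])  # highest-scoring label
```

The `result` dict contains `labels` sorted by descending score alongside the matching `scores`, so downstream code can threshold on `result["scores"][0]` to reject low-confidence predictions.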
© 2025 AIbase