# Self-Attention Distillation

## Nasa Smd Ibm Distil V0.1
License: Apache-2.0
INDUS-Small is a distilled version of INDUS, a RoBERTa-based encoder-only Transformer model, adapted for NASA Science Mission Directorate (SMD) applications and fine-tuned on SMD-related scientific journals and articles.
Tags: Large Language Model, Transformers, English
Publisher: nasa-impact
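
Since this is an encoder-only model on the Hugging Face Hub, a minimal loading sketch may help. The repository id `nasa-impact/nasa-smd-ibm-distil-v0.1` below is inferred from this listing and should be verified on the Hub before use.

```python
# Minimal sketch: load the distilled INDUS encoder and embed a sentence.
from transformers import AutoTokenizer, AutoModel

model_id = "nasa-impact/nasa-smd-ibm-distil-v0.1"  # assumed repo id, verify on the Hub
tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModel.from_pretrained(model_id)

inputs = tokenizer("Sea level rise observed by satellite altimetry.",
                   return_tensors="pt")
outputs = model(**inputs)
# Encoder-only model: take the first-token hidden state as a
# sentence-level representation.
embedding = outputs.last_hidden_state[:, 0]
print(embedding.shape)  # (1, hidden_size)
```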
## Xlm Roberta Comet Small
mMiniLM-L12xH384 XLM-R is a lightweight multilingual pre-trained model based on the MiniLMv2 architecture, compressed from XLM-RoBERTa via relational distillation.
Tags: Large Language Model, Transformers
Publisher: Unbabel
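
The entry above mentions MiniLMv2-style relational distillation, the core technique behind this page's tag. The sketch below illustrates the idea: the student matches the teacher's self-attention relations (here, the query-query relation of one layer) with a KL-divergence loss. Tensor shapes, the number of relation heads, and the function names are illustrative assumptions, not the authors' implementation.

```python
# Illustrative sketch of MiniLMv2-style self-attention relation distillation
# (not the authors' code). The student mimics the teacher's query-query
# relation distributions via KL divergence.
import math
import torch
import torch.nn.functional as F

def relation(x: torch.Tensor, num_heads: int) -> torch.Tensor:
    """Split hidden states into relation heads; return softmax(x x^T / sqrt(d))."""
    b, s, h = x.shape
    d = h // num_heads
    x = x.view(b, s, num_heads, d).transpose(1, 2)   # (b, heads, s, d)
    scores = x @ x.transpose(-1, -2) / math.sqrt(d)  # (b, heads, s, s)
    return F.softmax(scores, dim=-1)

def relation_distill_loss(student_q, teacher_q, num_heads=12):
    """KL(teacher || student) over self-attention relation distributions."""
    t = relation(teacher_q, num_heads)
    s = relation(student_q, num_heads)
    return F.kl_div(s.log(), t, reduction="batchmean")

# Toy usage: query projections from one layer of teacher and student.
teacher_q = torch.randn(2, 16, 384)  # (batch, seq, hidden) -- assumed sizes
student_q = torch.randn(2, 16, 384)
loss = relation_distill_loss(student_q, teacher_q, num_heads=12)
print(loss.item())
```

Matching relation distributions rather than raw attention weights lets the student use a different hidden size and head count than the teacher, which is what makes this style of distillation suitable for producing small models like the one listed above.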