# Small-Parameter Model Optimization

**Turkish Deepseek** (Apache-2.0)
A language model trained on Turkish text, based on the DeepSeek architecture and incorporating Multi-Head Latent Attention (MLA) and Mixture of Experts (MoE).
Tags: Large Language Model, Transformers, Other
Author: alibayram · 106 · 1
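The entry above mentions Mixture of Experts (MoE), in which a router sends each token to only a few expert sub-networks, keeping the active parameter count small. Below is a minimal, self-contained sketch of top-k expert routing in plain Python; the function names and the toy logits are illustrative and do not come from the Turkish Deepseek implementation.

```python
import math

def softmax(xs):
    """Numerically stable softmax over a list of logits."""
    m = max(xs)
    exps = [math.exp(x - m) for x in xs]
    s = sum(exps)
    return [e / s for e in exps]

def moe_route(gate_logits, k=2):
    """Toy MoE router: pick the top-k experts by gate probability
    and renormalize their weights so they sum to 1."""
    probs = softmax(gate_logits)
    topk = sorted(range(len(probs)), key=lambda i: probs[i], reverse=True)[:k]
    total = sum(probs[i] for i in topk)
    return [(i, probs[i] / total) for i in topk]

# Example: a router over 4 experts scores one token;
# only the 2 highest-scoring experts are activated.
selected = moe_route([2.0, 0.5, 1.0, -1.0], k=2)
```

In a real MoE layer the selected experts' outputs are combined as a weighted sum using these renormalized gate weights, so most experts stay idle for any given token.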
**Deberta V3 Small Finetuned Mnli** (MIT)
A small version of DeBERTa v3 fine-tuned on the GLUE MNLI dataset for natural language inference, reaching 87.46% accuracy.
Tags: Text Classification, Transformers, English
Author: mrm8488 · 139 · 3
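The 87.46% figure above is classification accuracy on MNLI, where each premise/hypothesis pair is labeled entailment, neutral, or contradiction. The sketch below shows how such a score is computed from a model's 3-way logits; the label ordering and the toy logits are assumptions for illustration, not taken from this model's actual configuration.

```python
# Assumed label order for a 3-way NLI head (real checkpoints may differ).
LABELS = ("entailment", "neutral", "contradiction")

def predict(logits):
    """Map one 3-way logit vector to an NLI label via argmax."""
    return LABELS[max(range(len(LABELS)), key=lambda i: logits[i])]

def accuracy(pred_logits, gold_labels):
    """Fraction of examples whose argmax label matches the gold label."""
    correct = sum(predict(l) == g for l, g in zip(pred_logits, gold_labels))
    return correct / len(gold_labels)

# Toy evaluation over two examples: the first is predicted correctly,
# the second (argmax = neutral) misses the gold "contradiction" label.
score = accuracy([[2.0, 0.1, 1.0], [0.2, 3.0, 1.0]],
                 ["entailment", "contradiction"])
```

Running this over the full MNLI validation split with a real model's logits yields the kind of accuracy figure quoted in the card.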