AIbase

# Efficient Small Model

## Falcon H1 1.5B Deep Base
**Author:** tiiuae · **License:** Other

Falcon-H1 is an efficient hybrid-architecture language model developed by TII. It combines Transformer and Mamba components and supports multilingual tasks.

**Tags:** Large Language Model, Transformers, Supports Multiple Languages
## Tiny Llava V1 Hf
**Author:** bczhou · **License:** Apache-2.0

TinyLLaVA is a compact large multimodal model framework focused on vision-language tasks, delivering strong performance despite its small parameter count.

**Tags:** Image-to-Text, Transformers, Supports Multiple Languages
## Rocket 3B
**Author:** pansophic

Rocket-3B is a 3-billion-parameter large language model trained on public datasets with Direct Preference Optimization (DPO), outperforming many larger models.

**Tags:** Large Language Model, Transformers, English
## It5 Efficient Small El32 News Summarization
**Author:** gsarti · **License:** Apache-2.0

An Italian news-summarization model based on the IT5 Efficient Small EL32 architecture, fine-tuned on the Fanpage and Il Post datasets.

**Tags:** Text Generation, Other
## Roberta Small
**Author:** smallbenchnlp

An efficient small neural language model from a benchmarking platform designed for single-GPU training.

**Tags:** Large Language Model, Transformers
© 2025 AIbase