# ReLU Activation Optimization

MiniCPM-S-1B-sft
Apache-2.0
MiniCPM-S-1B-sft is a 1B-parameter language model optimized with the ProSparse activation-sparsity method, achieving sparsity-driven inference acceleration while maintaining performance comparable to the original model.
Large Language Model · Transformers · Supports Multiple Languages
openbmb · 169 · 10
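
The acceleration claim above rests on activation sparsity: with ReLU, most intermediate neurons are exactly zero, so the down projection can skip the corresponding weight columns. Below is a minimal PyTorch sketch of that idea, not the ProSparse method itself (ProSparse additionally trains the model toward much higher sparsity than a randomly initialized ReLU layer gives); the `ReLUFFN` class is hypothetical.

```python
import torch
import torch.nn as nn

# Illustrative sketch only, not ProSparse: a ReLU feed-forward block
# whose intermediate activations are largely zero.
class ReLUFFN(nn.Module):
    def __init__(self, d_model=1024, d_ff=4096):
        super().__init__()
        self.up = nn.Linear(d_model, d_ff)
        self.down = nn.Linear(d_ff, d_model)

    def forward(self, x):
        h = torch.relu(self.up(x))  # ReLU zeroes every negative pre-activation
        return self.down(h), h

torch.manual_seed(0)
ffn = ReLUFFN()
x = torch.randn(4, 1024)  # a batch of 4 token embeddings
y, h = ffn(x)
print(f"activation sparsity: {(h == 0).float().mean():.1%}")  # ~50% at random init

# The zeros are what sparse inference exploits: for one token, only the
# columns of the down projection matching active neurons contribute.
h0 = h[0]
idx = h0.nonzero(as_tuple=True)[0]  # indices of active (nonzero) neurons
y_sparse = ffn.down.weight[:, idx] @ h0[idx] + ffn.down.bias
assert torch.allclose(y_sparse, y[0], atol=1e-4)
```

The higher the fraction of zeros in `h`, the fewer columns of `down.weight` need to be read, which is where the inference-time savings come from.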
ReluLLaMA-7B
A ReLU-activated sparse large language model fine-tuned from Llama-2-7B, improving computational efficiency through dynamic parameter selection
Large Language Model · Transformers · English
SparseLLM · 5,323 · 11
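
ReluLLaMA-7B swaps Llama 2's SiLU activations for ReLU but keeps the standard architecture, so it should load through the ordinary transformers API. A hedged sketch, assuming the repo id `SparseLLM/ReluLLaMA-7B` from the listing above and a standard Llama checkpoint:

```python
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

# Assumed repo id, taken from the listing above.
model_id = "SparseLLM/ReluLLaMA-7B"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # fp16 so a 7B model fits on one GPU
    device_map="auto",
)

prompt = "Activation sparsity speeds up inference because"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output = model.generate(**inputs, max_new_tokens=32)
print(tokenizer.decode(output[0], skip_special_tokens=True))
```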