MiniCPM-S-1B-sft is a 1B-parameter language model whose activations are sparsified via the ProSparse method, enabling high-sparsity inference acceleration while maintaining performance comparable to the original model.
Tags: Large Language Model · Transformers · Multilingual
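Below is a minimal usage sketch for loading the model with the Transformers library. The repo ID `openbmb/MiniCPM-S-1B-sft` and the need for `trust_remote_code=True` are assumptions based on other MiniCPM releases; check the model page for the exact ID. Note that plain Transformers runs the model densely, so the sparsity-based speedups require a sparsity-aware inference backend.

```python
# Minimal sketch, assuming the model is hosted on the Hugging Face Hub as
# "openbmb/MiniCPM-S-1B-sft" (unverified) and uses a custom architecture
# that requires trust_remote_code=True, like other MiniCPM variants.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openbmb/MiniCPM-S-1B-sft"  # assumed repo ID

tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to reduce memory use
    trust_remote_code=True,
).eval()

prompt = "Explain activation sparsity in one sentence."
inputs = tokenizer(prompt, return_tensors="pt")
with torch.no_grad():
    outputs = model.generate(**inputs, max_new_tokens=64)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```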