MiniCPM is a series of edge-side large language models jointly open-sourced by Mianbi Intelligence and the Natural Language Processing Laboratory of Tsinghua University. The core language model, MiniCPM-2B, contains only 2.4 billion non-embedding parameters.