MobileLLM is Meta's family of sub-billion-parameter language models optimized for resource-constrained devices, improving on-device inference efficiency through a deep-and-narrow architecture design.
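As a rough illustration of the deep-and-narrow idea, the minimal sketch below compares a many-layer, small-hidden-dimension transformer configuration against a shallower, wider one at a similar parameter budget. All layer counts, hidden sizes, and the vocabulary size here are illustrative assumptions, not MobileLLM's published hyperparameters.

```python
# Minimal sketch (not Meta's code): deep-narrow vs. wide-shallow transformer
# configurations at roughly the same parameter budget. All numbers are
# illustrative assumptions, not MobileLLM's published hyperparameters.
from dataclasses import dataclass

@dataclass
class TransformerConfig:
    n_layers: int        # number of transformer blocks
    d_model: int         # hidden (embedding) dimension
    n_heads: int         # attention heads
    vocab_size: int = 32000
    ffn_mult: int = 4    # FFN inner dim = ffn_mult * d_model

    def approx_params(self) -> int:
        """Rough parameter count: tied embedding + per-block attention/FFN weights."""
        embed = self.vocab_size * self.d_model            # shared input/output embedding
        attn = 4 * self.d_model * self.d_model            # Q, K, V, O projections
        ffn = 2 * self.d_model * (self.ffn_mult * self.d_model)
        return embed + self.n_layers * (attn + ffn)

# Deep-narrow: many layers, small hidden size (the direction MobileLLM favors).
deep_narrow = TransformerConfig(n_layers=30, d_model=576, n_heads=9)
# Wide-shallow: fewer layers, larger hidden size, roughly the same budget.
wide_shallow = TransformerConfig(n_layers=12, d_model=896, n_heads=14)

for name, cfg in [("deep-narrow", deep_narrow), ("wide-shallow", wide_shallow)]:
    print(f"{name:12s} layers={cfg.n_layers:2d} d_model={cfg.d_model:4d} "
          f"~params={cfg.approx_params() / 1e6:.0f}M")
```

At a fixed sub-billion-parameter budget, spending parameters on depth rather than width is the design choice the MobileLLM work argues improves quality for on-device models.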
Large Language Model
Transformers