
MobileLLM 1.5B

Developed by Meta
MobileLLM is an optimized Transformer architecture language model developed by Meta, specifically designed for resource-constrained edge-side applications, with parameter scales ranging from 125M to 1.5B.
Release date: 11/26/2024

Model Overview

MobileLLM is an autoregressive language model that employs an optimized Transformer architecture, focusing on achieving efficient performance on resource-constrained devices.

Model Features

Edge-side optimized design
Designed specifically for resource-constrained devices, achieving efficient inference through architectural optimizations
High-performance small model
Significantly outperforms other models at the same parameter scale
Fully trained
All models are trained on 1T tokens of data
Technical integration
Incorporates techniques such as the SwiGLU activation function and grouped-query attention
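The two techniques named above can be sketched briefly. This is a minimal NumPy illustration, not MobileLLM's actual implementation: SwiGLU replaces the standard feed-forward activation with a SiLU-gated product, and grouped-query attention lets several query heads share one key/value head to shrink the KV cache. All names, shapes, and head counts here are illustrative assumptions.

```python
import numpy as np

def swiglu(x, W, V, W2):
    """SwiGLU feed-forward block: (SiLU(x @ W) * (x @ V)) @ W2.
    W and V project to the hidden size; W2 projects back down."""
    def silu(z):
        return z / (1.0 + np.exp(-z))  # SiLU(z) = z * sigmoid(z)
    return (silu(x @ W) * (x @ V)) @ W2

def grouped_query_attention(q, k, v, n_kv_heads):
    """Grouped-query attention (illustrative, no masking or batching).
    q: (n_q_heads, seq, d); k, v: (n_kv_heads, seq, d).
    Consecutive groups of query heads share a single k/v head,
    so the KV cache is n_q_heads / n_kv_heads times smaller."""
    n_q_heads, seq, d = q.shape
    group = n_q_heads // n_kv_heads
    out = np.empty_like(q)
    for h in range(n_q_heads):
        kv = h // group                      # which shared KV head this query head uses
        scores = q[h] @ k[kv].T / np.sqrt(d)
        scores -= scores.max(axis=-1, keepdims=True)  # stable softmax
        w = np.exp(scores)
        w /= w.sum(axis=-1, keepdims=True)
        out[h] = w @ v[kv]
    return out
```

Multi-head attention is the special case `n_kv_heads == n_q_heads`, and multi-query attention the case `n_kv_heads == 1`; grouped-query attention interpolates between the two.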

Model Capabilities

Text generation
Common sense reasoning
Zero-shot learning

Use Cases

Mobile applications
Mobile device intelligent assistant
Delivers smooth conversational experiences on resource-constrained mobile devices
Research
Small model performance research
Explores the relationship between parameter efficiency and model performance