MobileLLaMA-1.4B-Base
Developed by mtgv
MobileLLaMA-1.4B-Base is a Transformer model with 1.4 billion parameters, designed for out-of-the-box deployment and trained on the RedPajama v1 dataset.
Downloads 1,376
Release Time: 12/28/2023
Model Overview
This model is a scaled-down version of LLaMA aimed at efficient deployment, and is suited to natural language understanding and commonsense reasoning tasks.
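As a rough sanity check on the stated 1.4B size, the parameter count of a LLaMA-style decoder can be estimated directly from its architecture configuration. The sketch below uses configuration values commonly reported for MobileLLaMA-1.4B (hidden size 2048, 24 layers, FFN width 5632, 32000-token vocabulary); these figures are assumptions for illustration and should be checked against the model's own config file.

```python
# Rough parameter count for a LLaMA-style decoder.
# Config values below are assumed for MobileLLaMA-1.4B, not verified here.

def llama_param_count(hidden, layers, ffn, vocab, tied_embeddings=False):
    embed = vocab * hidden                  # token embedding matrix
    attn = 4 * hidden * hidden              # q, k, v, o projections
    mlp = 3 * hidden * ffn                  # gate, up, down projections
    norms = 2 * hidden                      # two RMSNorm weights per layer
    per_layer = attn + mlp + norms
    head = 0 if tied_embeddings else vocab * hidden  # output (lm_head) matrix
    final_norm = hidden
    return embed + layers * per_layer + final_norm + head

total = llama_param_count(hidden=2048, layers=24, ffn=5632, vocab=32000)
print(f"~{total / 1e9:.2f}B parameters")  # lands close to the advertised 1.4B
```

The estimate comes out around 1.36B with untied embeddings, consistent with the "1.4B" naming once rounding is accounted for.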
Model Features
Scaled-down
A scaled-down version of LLaMA for easy out-of-the-box deployment.
Research Reproducibility
All models are trained solely on 1.3 trillion tokens from the RedPajama v1 dataset, which facilitates controlled comparison experiments.
High Performance
Performs comparably to the latest open-source models on standard natural language benchmarks.
Model Capabilities
Text Generation
Natural Language Understanding
Commonsense Reasoning
Use Cases
Natural Language Processing
Language Understanding
Used for evaluating and understanding natural language text.
Performs well on standard benchmarks.
Commonsense Reasoning
Used for reasoning and answering commonsense questions.
Performs well on standard benchmarks.
© 2025 AIbase