
OpenELM-1_1B

Developed by Apple
OpenELM is a family of efficient language models from Apple. It uses a layer-wise scaling strategy to allocate parameters across Transformer layers more effectively and ships pretrained and instruction-tuned models ranging from 270M to 3B parameters.
Downloads: 683
Release Time: 4/25/2025

Model Overview

OpenELM is an open family of efficient language models that improves performance through a layer-wise scaling strategy and is suited to a variety of natural language processing tasks.

Model Features

Layer-wise Scaling Strategy
Allocates parameters non-uniformly across the Transformer's layers, giving earlier layers smaller attention and feed-forward dimensions and later layers larger ones to improve parameter efficiency; a sketch of the idea follows this list.
Open Research Framework
Provides complete training, fine-tuning, and evaluation workflows to promote open research.
Multi-scale Options
Offers model sizes of 270M, 450M, 1.1B, and 3B parameters, each with pretrained and instruction-tuned variants.
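
As a rough illustration of layer-wise scaling, the sketch below interpolates the number of attention heads and the feed-forward width linearly across the depth of the network, so later layers receive more parameters than earlier ones. The constants (scale ranges, head dimension, layer count) are placeholders chosen for illustration, not OpenELM's actual configuration, which is defined in Apple's released code.

```python
# Illustrative sketch of layer-wise scaling: per-layer attention heads and
# FFN width grow with depth instead of being uniform across all layers.
# All constants below are placeholders, not OpenELM's real hyperparameters.

def layer_wise_config(num_layers: int,
                      model_dim: int = 2048,
                      head_dim: int = 64,
                      alpha=(0.5, 1.0),   # attention-width scale (first layer, last layer)
                      beta=(0.5, 4.0)):   # FFN-multiplier scale (first layer, last layer)
    """Return (num_heads, ffn_dim) for each Transformer layer."""
    configs = []
    for i in range(num_layers):
        t = i / max(num_layers - 1, 1)              # 0.0 at the first layer, 1.0 at the last
        a = alpha[0] + t * (alpha[1] - alpha[0])    # interpolated attention scale
        b = beta[0] + t * (beta[1] - beta[0])       # interpolated FFN multiplier
        num_heads = max(1, round(a * model_dim / head_dim))
        ffn_dim = round(b * model_dim)
        configs.append((num_heads, ffn_dim))
    return configs

for layer, (heads, ffn) in enumerate(layer_wise_config(num_layers=4)):
    print(f"layer {layer}: heads={heads}, ffn_dim={ffn}")
```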

Model Capabilities

Text Generation (a loading and generation sketch follows this list)
Zero-shot Learning
Instruction Following
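
A minimal loading and generation sketch with Hugging Face Transformers follows. The Hub ids are assumptions: apple/OpenELM-1_1B-Instruct for the instruction-tuned 1.1B checkpoint and meta-llama/Llama-2-7b-hf for the tokenizer (OpenELM reuses the Llama 2 tokenizer, which is access-gated); trust_remote_code=True is needed because the repository ships custom modeling code.

```python
# Minimal sketch, assuming the Hub ids below; adapt to your environment.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "apple/OpenELM-1_1B-Instruct"    # assumed instruction-tuned checkpoint
tokenizer_id = "meta-llama/Llama-2-7b-hf"   # assumed tokenizer (access-gated repo)

tokenizer = AutoTokenizer.from_pretrained(tokenizer_id)
model = AutoModelForCausalLM.from_pretrained(model_id, trust_remote_code=True)

prompt = "Explain in one sentence why the sky is blue."
inputs = tokenizer(prompt, return_tensors="pt")

# Greedy decoding keeps the example deterministic; adjust max_new_tokens as needed.
outputs = model.generate(**inputs, max_new_tokens=64, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```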

Use Cases

Natural Language Processing
Open-domain QA
Answers natural language questions across various domains.
Performs well on zero-shot benchmarks such as ARC-c; an evaluation sketch follows this section.
Text Completion
Generates coherent text content based on prompts.
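
The ARC-c result mentioned above can be checked with EleutherAI's lm-evaluation-harness; the harness is not named on this page, so treat the tool choice, the Hub id, and the exact API below (its 0.4.x Python interface) as assumptions.

```python
# Sketch of a zero-shot ARC-Challenge run with lm-evaluation-harness (assumed v0.4.x API).
import lm_eval

results = lm_eval.simple_evaluate(
    model="hf",                                                   # Hugging Face backend
    model_args="pretrained=apple/OpenELM-1_1B,trust_remote_code=True",
    tasks=["arc_challenge"],
    num_fewshot=0,                                                # zero-shot setting
)
print(results["results"]["arc_challenge"])                        # accuracy metrics for ARC-c
```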