
Sheared LLaMA 1.3B Pruned

Developed by princeton-nlp
Sheared-LLaMA-1.3B-Pruned is a 1.3B-parameter model pruned from Llama-2-7b without continued pretraining, intended primarily for studying pruning techniques and their effects.
Downloads: 25
Release date: 1/23/2024

Model Overview

This model is a 1.3B parameter version obtained from Llama-2-7b through pruning techniques, without undergoing continued pretraining. It is mainly used for researching pruning techniques, data mixing strategies for continued pretraining, and evaluating the impact of pruning on model knowledge and reasoning capabilities.
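Since the card does not include usage code, here is a minimal sketch of loading the model with the Hugging Face transformers library; the model id is taken from the card's developer and model name, and the generation settings are illustrative, not prescribed by the card.

```python
from transformers import AutoModelForCausalLM, AutoTokenizer

# Hub path assumed from the card: developer "princeton-nlp", model "Sheared-LLaMA-1.3B-Pruned"
MODEL_ID = "princeton-nlp/Sheared-LLaMA-1.3B-Pruned"

def generate(prompt: str, max_new_tokens: int = 50) -> str:
    """Load the pruned model and generate a continuation (illustrative settings)."""
    tokenizer = AutoTokenizer.from_pretrained(MODEL_ID)
    model = AutoModelForCausalLM.from_pretrained(MODEL_ID)
    inputs = tokenizer(prompt, return_tensors="pt")
    outputs = model.generate(**inputs, max_new_tokens=max_new_tokens)
    return tokenizer.decode(outputs[0], skip_special_tokens=True)
```

Because the checkpoint was pruned without continued pretraining, generations are expected to be noticeably weaker than the original Llama-2-7b; the model is meant for analysis rather than deployment.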

Model Features

Efficient pruning: pruned from a 7B-parameter model down to 1.3B parameters while retaining core capabilities.
Research-oriented: specifically designed for studying pruning techniques and their impact on model capabilities.
Lightweight: approximately 81% fewer parameters than the original 7B model, making it well suited to resource-constrained research.
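The "approximately 81%" figure follows directly from the nominal parameter counts on the card; a quick check:

```python
# Nominal parameter counts from the card, in billions
ORIGINAL_PARAMS = 7.0   # Llama-2-7b
PRUNED_PARAMS = 1.3     # Sheared-LLaMA-1.3B-Pruned

# Fraction of parameters removed by pruning, as a percentage
reduction_pct = (1 - PRUNED_PARAMS / ORIGINAL_PARAMS) * 100
print(f"{reduction_pct:.0f}% fewer parameters")  # -> 81% fewer parameters
```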

Model Capabilities

Text generation
Language understanding

Use Cases

Academic research
Pruning technique research: study the impact of different pruning methods on LLM performance.
Knowledge retention evaluation: assess the extent to which pruning preserves model knowledge and reasoning abilities.
Model optimization
Lightweight model development: serve as a foundation for developing smaller-scale LLMs.