Decapoda Research Llama 7B Hf

Developed by baffo32
LLaMA is an efficient foundational language model developed by Meta AI, available in parameter sizes ranging from 7B to 65B. Based on the Transformer architecture, it is suitable for various natural language processing tasks.
Downloads 12.29k
Release Time: 4/10/2023

Model Overview

LLaMA is an autoregressive language model based on the Transformer architecture, primarily designed for large language model research, including tasks such as question answering, natural language understanding, and reading comprehension.
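As a concrete illustration of how such a checkpoint is typically loaded and queried, here is a minimal sketch using the Hugging Face transformers library. The repo id baffo32/decapoda-research-llama-7B-hf, the half-precision dtype, and the decoding settings are assumptions for illustration, not details taken from this card.

```python
# Minimal usage sketch; assumes transformers >= 4.28 and the
# repo id below (an assumption, verify it matches your source).
import torch
from transformers import LlamaForCausalLM, LlamaTokenizer

repo_id = "baffo32/decapoda-research-llama-7B-hf"  # assumed repo id

tokenizer = LlamaTokenizer.from_pretrained(repo_id)
model = LlamaForCausalLM.from_pretrained(
    repo_id,
    torch_dtype=torch.float16,  # half precision to fit a single GPU
    device_map="auto",          # requires the `accelerate` package
)

prompt = "The Transformer architecture is"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)

# Autoregressive decoding: the model predicts one token at a time.
output_ids = model.generate(**inputs, max_new_tokens=50, do_sample=False)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```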

Model Features

Efficient Training
Builds on the standard Transformer architecture with an emphasis on training efficiency, delivering strong performance across multiple benchmarks.
Multilingual Support
Training data covers 20 languages; the model performs best in English but retains multilingual processing capability.
Multiple Sizes
Available in four parameter sizes (7B/13B/33B/65B) to meet different computational needs.

Model Capabilities

Text generation
Question answering systems
Natural language understanding
Reading comprehension
Common-sense reasoning

Use Cases

Academic Research
Language Model Research
Used to explore the limits of large language models and techniques for improving them.
Bias Evaluation
Assesses the model's biases related to gender, religion, race, and other attributes.
Average bias score: 66.6 (lower is better)
Application Development
Question Answering System
Builds knowledge-based question-answering applications.
Achieves 76.5% accuracy on the BoolQ benchmark (7B model)
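For context, results like the BoolQ number above are typically obtained with zero-shot scoring: the model is shown the passage and question, and the likelihoods it assigns to the candidate answers "yes" and "no" are compared. Below is a simplified sketch of that scheme; the prompt template and function name are illustrative assumptions, not the exact protocol behind the published figure.

```python
# Simplified BoolQ-style yes/no scoring; the prompt template and
# helper name are illustrative assumptions, not the exact protocol
# used to produce the published 76.5% accuracy.
import torch

def boolq_predict(model, tokenizer, passage: str, question: str) -> bool:
    prompt = f"{passage}\nQuestion: {question}?\nAnswer:"
    scores = {}
    for answer in (" yes", " no"):
        ids = tokenizer(prompt + answer,
                        return_tensors="pt").input_ids.to(model.device)
        with torch.no_grad():
            logits = model(ids).logits  # shape: [1, seq_len, vocab]
        # Log-probability of each token given its preceding context.
        log_probs = torch.log_softmax(logits[0, :-1], dim=-1)
        targets = ids[0, 1:]  # next-token targets
        n = len(tokenizer(answer, add_special_tokens=False).input_ids)
        # Sum log-probs over the answer tokens (the last n positions).
        scores[answer] = (
            log_probs[-n:].gather(1, targets[-n:, None]).sum().item()
        )
    return scores[" yes"] > scores[" no"]
```

Averaging such predictions over the BoolQ validation set yields an accuracy figure comparable to the one quoted above.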