
Llama 4 Scout 17B 16E Linearized bnb NF4 bf16

Developed by axolotl-quants
Llama 4 Scout is a Mixture of Experts (MoE) model released by Meta, with 17 billion active parameters routed across 16 experts. This checkpoint, quantized to bnb NF4 with bf16 compute, supports multilingual text and image understanding, and its expert modules are linearized for PEFT/LoRA compatibility.
Downloads: 6,861
Release Time: 4/7/2025

Model Overview

A multimodal AI model based on the Mixture of Experts architecture, excelling in text generation, image understanding, and code generation, with support for 12 languages.

Model Features

Linearized Expert Modules
The expert modules are converted to plain linear layers ("linearized"), making them directly compatible with fine-tuning frameworks such as PEFT/LoRA; see the loading sketch after this feature list.
Multimodal Support
Supports early fusion for joint text and image processing, with image understanding supported for up to 5 input images per prompt.
Long Context Processing
The Scout model supports a context length of 10M tokens, while Maverick supports 1M tokens.
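
Because the linearized experts expose standard linear layers, a LoRA adapter can be attached directly on top of the 4-bit checkpoint. The following is a minimal sketch using transformers, bitsandbytes, and peft; the repository id and the target module names are assumptions for illustration, not taken from this page.

```python
# A minimal sketch, assuming a hypothetical repo id and standard Llama-style
# projection names. Loads the NF4-quantized checkpoint with bitsandbytes and
# attaches a LoRA adapter via peft.
import torch
from transformers import AutoModelForCausalLM, BitsAndBytesConfig
from peft import LoraConfig, get_peft_model

# 4-bit config matching the checkpoint's naming: NF4 weights, bf16 compute.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

model = AutoModelForCausalLM.from_pretrained(
    "axolotl-quants/llama-4-scout-17b-16e-linearized-bnb-nf4-bf16",  # hypothetical id
    quantization_config=bnb_config,
    device_map="auto",
)

# Linearized experts expose plain linear layers, so PEFT can target them
# alongside the attention projections. The module names below are assumptions.
lora_config = LoraConfig(
    r=16,
    lora_alpha=32,
    target_modules=[
        "q_proj", "k_proj", "v_proj", "o_proj",
        "gate_proj", "up_proj", "down_proj",
    ],
    task_type="CAUSAL_LM",
)
model = get_peft_model(model, lora_config)
model.print_trainable_parameters()
```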

Model Capabilities

Multilingual Text Generation
Image Content Understanding
Code Generation and Completion
Long Document Translation
Multi-turn Dialogue
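
To make the image-understanding and multi-turn dialogue capabilities above concrete, here is a minimal sketch of one mixed text-image chat turn through the transformers chat-template API; the repository id and image URL are placeholders.

```python
# A minimal sketch, assuming a hypothetical repo id and a placeholder image
# URL. Runs one mixed text-image chat turn through the processor's chat
# template.
from transformers import AutoModelForImageTextToText, AutoProcessor

repo = "axolotl-quants/llama-4-scout-17b-16e-linearized-bnb-nf4-bf16"  # hypothetical
processor = AutoProcessor.from_pretrained(repo)
model = AutoModelForImageTextToText.from_pretrained(repo, device_map="auto")

messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/receipt.png"},  # placeholder
            {"type": "text", "text": "Summarize this receipt in French."},
        ],
    }
]

inputs = processor.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

output = model.generate(**inputs, max_new_tokens=256)
# Decode only the newly generated tokens.
print(processor.decode(output[0][inputs["input_ids"].shape[-1]:],
                       skip_special_tokens=True))
```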

Use Cases

Business Applications
Intelligent Customer Service
Deploy multilingual customer service systems supporting mixed text-image Q&A
The underlying model scores 79.6 on the MMLU benchmark
Research & Development
Synthetic Data Generation
Utilize the model to generate training data for downstream tasks
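
A minimal sketch of that workflow follows, reusing the model and processor from the sketches above; the prompt and sampling settings are illustrative assumptions rather than recommendations from this page.

```python
# A minimal sketch of a synthetic-data loop, reusing `model` and `processor`
# from the sketches above. The prompt and sampling settings are illustrative
# assumptions; adapt them to the downstream task.
import json

prompt = (
    "Write one customer-support question about a delayed order, "
    "then a concise answer. Format: Q: ... A: ..."
)
messages = [{"role": "user", "content": [{"type": "text", "text": prompt}]}]

with open("synthetic_pairs.jsonl", "w") as f:
    for _ in range(100):
        inputs = processor.apply_chat_template(
            messages,
            add_generation_prompt=True,
            tokenize=True,
            return_dict=True,
            return_tensors="pt",
        ).to(model.device)
        out = model.generate(**inputs, max_new_tokens=128,
                             do_sample=True, temperature=0.9)
        text = processor.decode(out[0][inputs["input_ids"].shape[-1]:],
                                skip_special_tokens=True)
        f.write(json.dumps({"text": text}, ensure_ascii=False) + "\n")
```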