
Llama 4 Scout 17B 16E Unsloth Bnb 4bit

Developed by unsloth
Llama 4 Scout is a multimodal mixture-of-experts model developed by Meta, supporting 12 languages and image understanding, with 17 billion active parameters and a 10M-token context length.
Downloads 2,492
Release Time: 4/6/2025

Model Overview

An autoregressive language model built on a mixture-of-experts architecture, it supports multilingual text generation, code generation, and image understanding, making it suitable for both commercial and research applications.

Model Features

Mixture-of-Experts Architecture
Features a 16-expert design with 17 billion active parameters and 109 billion total parameters, balancing performance and efficiency.
Multimodal Support
Accepts text and image inputs, integrating both modalities through early fusion.
Long Context Processing
10M token context window, ideal for handling long documents and complex tasks.
Dynamic Quantization
The base model provides BF16 weights with support for on-the-fly int4 quantization; this repository ships Unsloth's dynamic bitsandbytes 4-bit variant, significantly reducing deployment resource requirements (a loading sketch follows this list).
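As a reference, here is a minimal loading sketch using Hugging Face transformers with bitsandbytes. It assumes a recent transformers release with Llama 4 support; the repository id is inferred from this card's title and should be verified on Hugging Face, and the explicit BitsAndBytesConfig can typically be omitted when the repository already ships pre-quantized 4-bit weights.

```python
import torch
from transformers import AutoProcessor, AutoModelForImageTextToText, BitsAndBytesConfig

# Repository id inferred from this card's title -- verify the exact name on Hugging Face.
MODEL_ID = "unsloth/Llama-4-Scout-17B-16E-unsloth-bnb-4bit"

# 4-bit NF4 quantization with BF16 compute, mirroring the quantization feature above.
# A pre-quantized bitsandbytes repo usually carries this config already, so it may be redundant.
bnb_config = BitsAndBytesConfig(
    load_in_4bit=True,
    bnb_4bit_quant_type="nf4",
    bnb_4bit_compute_dtype=torch.bfloat16,
)

processor = AutoProcessor.from_pretrained(MODEL_ID)
model = AutoModelForImageTextToText.from_pretrained(
    MODEL_ID,
    quantization_config=bnb_config,
    device_map="auto",  # spread the mixture-of-experts layers across available GPUs
)
```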

Model Capabilities

Multilingual text generation
Code generation
Image understanding
Visual reasoning (see the inference sketch after this list)
Long document processing
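The sketch below illustrates the image-understanding and visual-reasoning capabilities listed above, continuing from the loading sketch: an image and a text question are combined through the processor's chat template. The image URL is a placeholder, and the message format assumes the standard multimodal chat-template convention in recent transformers versions.

```python
# Continues from the loading sketch above (`processor` and `model` already created).
messages = [
    {
        "role": "user",
        "content": [
            {"type": "image", "url": "https://example.com/chart.png"},  # placeholder URL
            {"type": "text", "text": "Describe this image and summarize its main trend."},
        ],
    }
]

# The processor renders the chat template and prepares both text and image tensors.
inputs = processor.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

outputs = model.generate(**inputs, max_new_tokens=256)

# Decode only the newly generated tokens, skipping the prompt.
answer = processor.batch_decode(
    outputs[:, inputs["input_ids"].shape[-1]:], skip_special_tokens=True
)[0]
print(answer)
```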

Use Cases

Commercial Applications
Intelligent Customer Service: a multilingual customer support system providing real-time conversation support in 12 languages.
Document Analysis: automatic summarization of long contracts and reports, leveraging the 10M-token context window (a summarization sketch follows at the end of this section).
Research & Development
Multimodal Research: a platform for joint text-image understanding experiments, reporting 73.4% accuracy on joint image-text reasoning (MMMU).
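For the document-analysis use case, a text-only sketch along the same lines: a long report is passed in a single prompt rather than being chunked, relying on the long context window. The file path is a placeholder, and in practice the usable context is bounded by available GPU memory for the KV cache rather than by the 10M-token architectural limit.

```python
# Continues from the loading sketch earlier (`processor` and `model` already created).
with open("contract.txt", encoding="utf-8") as f:  # placeholder path to a long document
    long_document = f.read()

messages = [
    {
        "role": "user",
        "content": [
            {
                "type": "text",
                "text": "Summarize the key obligations, parties, and deadlines "
                        "in the following contract:\n\n" + long_document,
            },
        ],
    }
]

inputs = processor.apply_chat_template(
    messages,
    add_generation_prompt=True,
    tokenize=True,
    return_dict=True,
    return_tensors="pt",
).to(model.device)

summary_ids = model.generate(**inputs, max_new_tokens=512)
summary = processor.batch_decode(
    summary_ids[:, inputs["input_ids"].shape[-1]:], skip_special_tokens=True
)[0]
print(summary)
```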