Devstral Small 2507 4bit DWQ
This is a 4-bit DWQ-quantized language model in MLX format that supports multilingual text generation tasks.
Downloads: 159
Release date: 7/12/2025
Model Overview
This model is a 4-bit DWQ-quantized version converted from Devstral-Small-2507-bf16 and is suited to text generation tasks.
Model Features
4-bit quantization
Uses 4-bit DWQ quantization to reduce model size while largely preserving output quality
Multilingual support
Supports text generation in 24 languages
MLX compatibility
Converted to the MLX format, so it runs efficiently on Apple silicon devices (see the sketch after this list)
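The snippet below is a minimal sketch of running the model with the mlx-lm Python package on Apple silicon. The repository id mlx-community/Devstral-Small-2507-4bit-DWQ is an assumption; substitute the actual model path for this release.

```python
# Minimal text-generation sketch using the mlx-lm package (pip install mlx-lm).
# The repo id below is an assumption; replace it with the actual model path.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Devstral-Small-2507-4bit-DWQ")

prompt = "Write a short Python function that reverses a string."
text = generate(model, tokenizer, prompt=prompt, max_tokens=256, verbose=True)
print(text)
```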
Model Capabilities
Text generation
Multilingual processing
Use Cases
Text generation
Dialogue systems
Can be used to build multilingual chatbots (see the chat sketch after this list)
Content creation
Generates written content in multiple languages
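As a chatbot-style sketch, the example below formats a conversation with the tokenizer's chat template and generates a reply. The tokenizer returned by mlx_lm.load wraps the underlying Hugging Face tokenizer, so apply_chat_template is available; the repository id is again an assumption.

```python
# Chatbot-style sketch: format a conversation with the chat template, then generate.
# The repo id is an assumption; replace it with the real model path.
from mlx_lm import load, generate

model, tokenizer = load("mlx-community/Devstral-Small-2507-4bit-DWQ")

messages = [
    {"role": "system", "content": "You are a helpful multilingual assistant."},
    {"role": "user", "content": "Bonjour ! Peux-tu expliquer la quantization 4 bits ?"},
]

# The wrapped Hugging Face tokenizer exposes apply_chat_template.
prompt = tokenizer.apply_chat_template(
    messages, tokenize=False, add_generation_prompt=True
)

reply = generate(model, tokenizer, prompt=prompt, max_tokens=300)
print(reply)
```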