
Mistral Small 24B Instruct 2501 AWQ

Developed by stelterlab
Downloads: 52.55k
Released: 1/30/2025

Mistral Small 3 (version 2501) is a 24B-parameter instruction-tuned large language model that sets a new benchmark in the sub-70B category, combining high knowledge density with broad multilingual support.

Model Overview

This is a 24B-parameter large language model optimized through instruction tuning, suited to scenarios such as conversational agents, function calling, and local inference.

Model Features

Multilingual Support
Supports dozens of languages, including English, French, German, Spanish, Italian, and Chinese
Agent Core
Provides top-tier agent capabilities with native function calling and JSON output support
Advanced Reasoning
Equipped with state-of-the-art dialogue and reasoning capabilities
Local Deployment
Once quantized, can run on a single RTX 4090 GPU or a MacBook with 32 GB of RAM

Model Capabilities

Multilingual text generation
Dialogue systems
Function calling
JSON format output
Instruction following
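These capabilities are typically exercised through an OpenAI-compatible chat API (for example, one served locally by an inference engine). The sketch below builds such a request payload; the endpoint URL and model id are assumptions for illustration, not details from this card:

```python
import json

# Hypothetical local endpoint and model id (assumptions, not from the card).
BASE_URL = "http://localhost:8000/v1/chat/completions"
MODEL_ID = "stelterlab/Mistral-Small-24B-Instruct-2501-AWQ"

def build_chat_request(user_message: str,
                       system_prompt: str = "You are a helpful assistant.") -> dict:
    """Build an OpenAI-style chat-completion payload, ready to POST as JSON."""
    return {
        "model": MODEL_ID,
        "messages": [
            {"role": "system", "content": system_prompt},
            {"role": "user", "content": user_message},
        ],
        # A low temperature suits instruction following and JSON output.
        "temperature": 0.15,
    }

# Multilingual use is just a matter of the message content.
payload = build_chat_request("Bonjour ! Présente-toi en une phrase.")
print(json.dumps(payload, ensure_ascii=False, indent=2))
```

The same payload shape works for any of the supported languages; only the message text changes.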

Use Cases

Dialogue Agent
Multilingual Conversation
Supports fluent conversations in multiple languages
Generates responses that follow the conventions of each language
Function Calling
API Integration
Can serve as an agent core to integrate external APIs
Supports JSON format output and tool calling
Local Inference
Sensitive Data Processing
Suitable for scenarios requiring local processing of sensitive data
Can run on local devices to protect data privacy
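As a sketch of the API-integration use case, the snippet below declares an OpenAI-style tool schema and dispatches a tool call of the kind the model can emit as JSON. The `get_weather` tool, its arguments, and the example call string are illustrative assumptions, not part of this model card:

```python
import json

# Illustrative tool schema in the OpenAI function-calling format.
TOOLS = [
    {
        "type": "function",
        "function": {
            "name": "get_weather",  # hypothetical tool for this example
            "description": "Get the current weather for a city.",
            "parameters": {
                "type": "object",
                "properties": {"city": {"type": "string"}},
                "required": ["city"],
            },
        },
    }
]

def dispatch_tool_call(raw_call: str) -> str:
    """Parse a JSON tool call and run the matching local function."""
    call = json.loads(raw_call)
    if call["name"] == "get_weather":
        # A real integration would query an external weather API here.
        return f"Weather lookup for {call['arguments']['city']}"
    raise ValueError(f"Unknown tool: {call['name']}")

# Hand-written example of the JSON a tool call might contain.
example_call = '{"name": "get_weather", "arguments": {"city": "Paris"}}'
print(dispatch_tool_call(example_call))  # → Weather lookup for Paris
```

In a full agent loop, the schema in `TOOLS` would be sent with each chat request, and any tool call in the model's response would be routed through a dispatcher like this before the result is fed back to the model.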