
Phi-3 Mini 4K Instruct ONNX

Developed by Microsoft
Phi-3 Mini is a lightweight, state-of-the-art open model trained on high-quality, reasoning-dense data, supporting a 4K context length.
Downloads: 370
Released: April 23, 2024

Model Overview

This repository contains the Phi-3 Mini-4K-Instruct model converted to ONNX format for ONNX Runtime inference, supporting multiple hardware targets including CPU, GPU, and mobile devices.
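As a rough sketch of how the converted model can be loaded and queried, assuming the `onnxruntime-genai` Python package and its `Model`/`Tokenizer`/`Generator` classes (the package's API has changed between releases, so treat the calls below as illustrative), plus the Phi-3 chat template documented on the upstream model card:

```python
def format_phi3_prompt(user_message: str) -> str:
    # Phi-3 instruct models expect this chat template; check the
    # model card for the exact tags used by your model revision.
    return f"<|user|>\n{user_message}<|end|>\n<|assistant|>\n"

def generate(model_dir: str, user_message: str, max_length: int = 256) -> str:
    # Assumes the onnxruntime-genai package; its API surface varies
    # across versions, so this is a sketch, not a fixed recipe.
    import onnxruntime_genai as og

    model = og.Model(model_dir)          # e.g. the int4 CPU variant folder
    tokenizer = og.Tokenizer(model)
    params = og.GeneratorParams(model)
    params.set_search_options(max_length=max_length)
    params.input_ids = tokenizer.encode(format_phi3_prompt(user_message))

    generator = og.Generator(model, params)
    tokens = []
    while not generator.is_done():
        generator.compute_logits()
        generator.generate_next_token()
        tokens.append(generator.get_next_tokens()[0])
    return tokenizer.decode(tokens)
```

The prompt formatting matters: without the `<|user|>`/`<|assistant|>` tags the instruct-tuned weights tend to produce much weaker completions.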

Model Features

Multi-hardware support
Supports running on CPU, GPU, and mobile devices, including Windows, Linux, and Mac platforms.
Efficient quantization
Provides int4-quantized and fp16 variants that improve inference performance while maintaining accuracy.
High-performance inference
Runs efficiently in ONNX Runtime, with reported speedups of up to 10x over PyTorch.
Cross-platform compatibility
Supports AMD, Intel, and NVIDIA GPUs on Windows through DirectML, enabling vendor-independent hardware acceleration.
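The hardware targets above map onto ONNX Runtime execution providers. A minimal sketch of selecting one (the provider names are ONNX Runtime's standard identifiers; which ones are actually usable depends on the installed onnxruntime build):

```python
# Standard ONNX Runtime execution-provider names, in preference
# order for each hardware target described above.
PROVIDER_PREFERENCE = {
    "nvidia-gpu": ["CUDAExecutionProvider", "CPUExecutionProvider"],
    "directml":   ["DmlExecutionProvider", "CPUExecutionProvider"],  # AMD/Intel/NVIDIA on Windows
    "cpu":        ["CPUExecutionProvider"],
}

def pick_provider(target: str, available: list) -> str:
    # Choose the first preferred provider that the installed
    # onnxruntime build actually reports as available.
    for provider in PROVIDER_PREFERENCE.get(target, ["CPUExecutionProvider"]):
        if provider in available:
            return provider
    return "CPUExecutionProvider"

# In a real program, `available` would come from
# onnxruntime.get_available_providers().
```

Falling back to `CPUExecutionProvider` keeps the same code path working on machines without a supported GPU.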

Model Capabilities

Text generation
Instruction following
Conversational interaction

Use Cases

Dialogue systems
Joke generation
The model can generate humorous jokes based on user requests.
Why don't scientists trust atoms? Because they make up everything!
Code generation
Code assistance
Supports code generation and completion for languages like Python, C, and C++.
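For the dialogue use case, a multi-turn conversation has to be flattened into the Phi-3 chat template before tokenization. A minimal sketch, assuming the `<|system|>`/`<|user|>`/`<|assistant|>`/`<|end|>` tags shown on the upstream model card:

```python
def build_chat_prompt(history, system=""):
    # `history` is a list of (role, text) pairs, role in {"user", "assistant"}.
    # The trailing "<|assistant|>\n" cues the model to produce the next reply.
    parts = []
    if system:
        parts.append(f"<|system|>\n{system}<|end|>\n")
    for role, text in history:
        parts.append(f"<|{role}|>\n{text}<|end|>\n")
    parts.append("<|assistant|>\n")
    return "".join(parts)

prompt = build_chat_prompt(
    [("user", "Tell me a joke about atoms.")],
    system="You are a witty assistant.",
)
```

The same builder works for code-assistance prompts: put the coding request in the user turn and, if desired, constrain style or language in the system message.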