DeepSeek-R1-Distill-Llama-70B-abliterated MLX 4-bit
This is a distilled model based on Llama-70B, converted to MLX format via mlx-lm and quantized to 4 bits.
Downloads: 358
Release date: 4/25/2025
Model Overview
This model is an MLX-format conversion of huihui-ai/DeepSeek-R1-Distill-Llama-70B-abliterated, intended primarily for text generation tasks.
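A minimal sketch of running the model with the mlx-lm library on Apple silicon (assumes `pip install mlx-lm`; the repo id passed to `load` below is an assumption based on this card's title, not confirmed by the source):

```python
"""Sketch: loading and prompting the 4-bit MLX model with mlx-lm.

Assumes Apple silicon, `pip install mlx-lm`, and an assumed repo id.
"""


def build_chat_prompt(tokenizer, user_message):
    # Use the tokenizer's chat template so the distilled model sees
    # the same conversation format it was trained with.
    return tokenizer.apply_chat_template(
        [{"role": "user", "content": user_message}],
        add_generation_prompt=True,
        tokenize=False,
    )


if __name__ == "__main__":
    from mlx_lm import load, generate

    # Repo id is an assumption for illustration.
    model, tokenizer = load(
        "huihui-ai/DeepSeek-R1-Distill-Llama-70B-abliterated-mlx-4bit"
    )
    prompt = build_chat_prompt(
        tokenizer, "Explain 4-bit quantization in one paragraph."
    )
    print(generate(model, tokenizer, prompt=prompt, max_tokens=256))
```

The actual model call is kept behind the `__main__` guard since it downloads tens of gigabytes of weights; `build_chat_prompt` works with any Hugging Face-style tokenizer.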
Model Features
4-bit Quantization
The weights are quantized to 4 bits, substantially reducing memory and compute requirements
MLX Format
Converted to MLX format for efficient inference on Apple silicon
Distilled Model
Distilled from DeepSeek-R1 into the Llama-70B architecture, retaining strong performance in a much smaller model
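To make the quantization benefit concrete, a back-of-the-envelope calculation of the weight footprint for a 70B-parameter model (ignoring activation memory and quantization-scale overhead, so real figures are somewhat higher):

```python
# Approximate weight memory for a 70B-parameter model at different
# precisions. Overheads (scales, activations, KV cache) are ignored.
PARAMS = 70e9


def weight_gb(bits_per_param):
    # bits -> bytes (/8), bytes -> GB (/1e9)
    return PARAMS * bits_per_param / 8 / 1e9


print(f"fp16 : {weight_gb(16):.0f} GB")   # 140 GB
print(f"4-bit: {weight_gb(4):.0f} GB")    # 35 GB
```

Dropping from fp16 to 4 bits cuts the weight footprint by 4x, which is what brings a 70B model within reach of high-memory Apple silicon machines.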
Model Capabilities
Text Generation
Dialogue System
Use Cases
Natural Language Processing
Dialogue Generation
Build dialogue systems that generate natural, fluent responses
Content Creation
Assist with text creation, such as article writing and story generation