Pygmalion 6b 4bit 128g
A 4-bit GPTQ-quantized model based on Pygmalion-6B, suited to dialogue generation tasks and supporting English text generation
Downloads: 40
Release Time: 3/28/2023
Model Overview
This is a 4-bit quantized dialogue generation model based on Pygmalion-6B. GPTQ quantization compresses the model's storage footprint while preserving good generation quality.
Model Features
4-bit Quantization
Uses GPTQ technology to achieve 4-bit quantization, significantly reducing model storage requirements
128 Group Size
Adopts a group size of 128 during quantization to balance quantization accuracy and computational efficiency
Safe Tensor Format
The model is saved in the safetensors format, which provides a safer loading path than pickle-based checkpoints (a loading sketch follows this list)
Dialogue Optimization
Text generation reportedly optimized for dialogue scenarios (speculated, not confirmed by the upstream model card)
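The features above come together at load time: the 4-bit GPTQ weights, the 128 group size, and the safetensors checkpoint are all handled by a GPTQ-aware loader. The sketch below uses the AutoGPTQ library purely as an illustration; the model id is a placeholder, and the actual weights may instead target GPTQ-for-LLaMa forks or a UI such as text-generation-webui.

```python
# Minimal loading sketch, not official usage. Assumes the `auto-gptq` and
# `transformers` packages are installed; the model id below is a placeholder.
from transformers import AutoTokenizer
from auto_gptq import AutoGPTQForCausalLM

model_id = "pygmalion-6b-4bit-128g"  # placeholder: replace with the real repo id or local path

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoGPTQForCausalLM.from_quantized(
    model_id,
    device="cuda:0",       # 4-bit GPTQ inference still expects a CUDA device
    use_safetensors=True,  # load the safetensors checkpoint mentioned above
)
```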
Model Capabilities
English Text Generation
Conversational Interaction
Quantized Inference
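As a rough illustration of quantized inference, a model loaded as in the sketch above supports the ordinary generate API from transformers; the prompt and sampling settings here are arbitrary example values.

```python
# Illustrative generation call; assumes `model` and `tokenizer` from the
# loading sketch above. Sampling parameters are example values only.
prompt = "The old lighthouse keeper looked out at the storm and said,"
inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")

output_ids = model.generate(
    **inputs,
    max_new_tokens=80,
    do_sample=True,
    temperature=0.8,
    top_p=0.9,
)
print(tokenizer.decode(output_ids[0], skip_special_tokens=True))
```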
Use Cases
Dialogue Systems
Virtual Character Dialogue
Engage in natural language conversations with AI characters
Generates coherent responses that align with character settings (a prompt-format sketch follows this subsection)
Chatbot
Build lightweight English chatbots
Achieves smooth conversations in resource-constrained environments
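For character dialogue, Pygmalion-family models are commonly prompted with a persona block followed by a chat transcript. The template below follows that widely used convention but should be checked against the upstream Pygmalion-6B model card; the character, persona, and history are invented examples, and `model`/`tokenizer` come from the earlier sketches.

```python
# Sketch of a persona-plus-transcript prompt in the style commonly used with
# Pygmalion models; verify the exact format against the upstream model card.
character = "Aria"
persona = "Aria is a cheerful ship navigator who loves old star maps."
history = [("You", "Do you think we can reach the outer belt by morning?")]

prompt = f"{character}'s Persona: {persona}\n<START>\n"
prompt += "\n".join(f"{speaker}: {text}" for speaker, text in history)
prompt += f"\n{character}:"

inputs = tokenizer(prompt, return_tensors="pt").to("cuda:0")
output_ids = model.generate(**inputs, max_new_tokens=60, do_sample=True, temperature=0.7)
# Decode only the newly generated tokens as the character's reply.
reply = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(reply)
```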
Content Generation
Creative Writing Assistance
Assists users in English creative writing
Generates coherent storylines or dialogue content