
Stable Vicuna 13B GPTQ

Quantized and released by TheBloke
StableVicuna-13B is a dialogue model fine-tuned from Vicuna-13B v0 via RLHF, distributed here in 4-bit GPTQ quantized format
Downloads: 49
Release Time: 4/28/2023

Model Overview

This is a quantized 13B-parameter dialogue model suited to text generation tasks and optimized specifically for conversational interaction

Model Features

4-bit GPTQ quantization
The model is quantized to 4-bit precision with GPTQ, significantly reducing memory usage while maintaining good inference quality (a loading sketch follows this list)
RLHF fine-tuning
Fine-tuned via Reinforcement Learning from Human Feedback (RLHF) to optimize conversational interaction
Multi-dataset training
Trained on multiple high-quality dialogue datasets including OASST1, GPT4All, and Alpaca
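As a rough illustration of how 4-bit GPTQ weights like these are typically loaded, here is a minimal sketch assuming the Hugging Face transformers stack with the optimum and auto-gptq packages installed; the repository id and settings are assumptions for illustration, not details taken from this page.

```python
# Minimal loading sketch (assumes transformers + optimum + auto-gptq are installed;
# the repository id below is assumed, not stated on this page).
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/stable-vicuna-13B-GPTQ"  # assumed Hugging Face repo id

tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=True)
# transformers picks up the GPTQ quantization config shipped with the repo and
# loads the 4-bit weights onto the available GPU(s).
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")
```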

Model Capabilities

Text generation
Dialogue interaction
Instruction following

Use Cases

Dialogue system
Intelligent assistant
Can be used to build intelligent conversational assistants
Capable of generating natural and fluent dialogue responses (see the prompt sketch at the end of this section)
Content generation
Creative writing
Assists in story creation and content generation
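To give a concrete sense of the dialogue use case, the sketch below loads the model as above and sends a single turn in the "### Human: ... ### Assistant:" template commonly used with StableVicuna; the exact prompt format and sampling settings are assumptions, so check the upstream model card before relying on them.

```python
# Single-turn dialogue sketch; the "### Human:/### Assistant:" template and the
# sampling settings are assumptions, not taken from this page.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "TheBloke/stable-vicuna-13B-GPTQ"  # assumed repo id
tokenizer = AutoTokenizer.from_pretrained(model_id, use_fast=True)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

prompt = "### Human: Write a short, friendly greeting for a support chatbot.\n### Assistant:"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
output_ids = model.generate(
    **inputs,
    max_new_tokens=128,
    do_sample=True,
    temperature=0.7,
    top_p=0.9,
)
# Drop the prompt tokens and print only the newly generated reply.
reply = tokenizer.decode(output_ids[0][inputs["input_ids"].shape[1]:], skip_special_tokens=True)
print(reply)
```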