Qwen 7B Chat GPTQ

Developed by openerotica
A 7-billion-parameter Transformer-based large language model developed by Alibaba Cloud, supporting Chinese, English, and code, with multi-turn dialogue capabilities.
Downloads: 26
Release Date: 8/9/2023

Model Overview

Qwen-7B-Chat is an AI assistant built from the Qwen-7B base model through alignment techniques and optimized for multi-turn dialogue; this listing covers a GPTQ-quantized build of that model.
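As a minimal sketch of how such a chat-aligned Qwen checkpoint is typically driven from Python, assuming the quantized weights live under a Hugging Face repo id like openerotica/Qwen-7B-Chat-GPTQ (check the actual model page for the real path) and that a GPTQ backend such as optimum/auto-gptq is installed:

```python
# Minimal multi-turn chat sketch. The repo id below is an assumption;
# substitute the actual Hugging Face path of this GPTQ build.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "openerotica/Qwen-7B-Chat-GPTQ"  # hypothetical repo id

# Qwen checkpoints ship custom modeling code, so trust_remote_code is required.
tokenizer = AutoTokenizer.from_pretrained(model_id, trust_remote_code=True)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    device_map="auto",
    trust_remote_code=True,
).eval()

# Qwen's custom code exposes a chat() helper that manages multi-turn history.
response, history = model.chat(tokenizer, "Hello, who are you?", history=None)
print(response)

# Pass the returned history back in to continue the same conversation.
response, history = model.chat(
    tokenizer, "Summarize what you just said in one sentence.", history=history
)
print(response)
```

Passing the returned history back into chat() is what provides the multi-turn behavior described above.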

Model Features

Efficient Tokenization
A vocabulary of roughly 150K tokens built on tiktoken-based BPE encodes Chinese, English, and code efficiently (see the tokenizer sketch after this feature list).
Long-text Understanding
Supports a 2,048-token context length and scores 16.6 on the VCSUM Chinese long-text summarization task.
Multi-domain Capabilities
Balanced performance across programming, mathematics, the social sciences, and other domains, with a 24.4% Pass@1 on the HumanEval coding benchmark.
Tool Calling
Supports ReAct prompting and the HuggingFace Agent interface, with tool selection accuracy reported as high as 99% (a minimal ReAct sketch follows the Model Capabilities list below).
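To see the tiktoken-based vocabulary at work on mixed Chinese, English, and code input, a quick sketch (the repo id is again an assumption; any Qwen-7B checkpoint ships the same tokenizer):

```python
# Sketch: inspect how the tiktoken-based BPE vocabulary encodes mixed input.
from transformers import AutoTokenizer

tokenizer = AutoTokenizer.from_pretrained(
    "openerotica/Qwen-7B-Chat-GPTQ",  # hypothetical repo id
    trust_remote_code=True,
)

samples = [
    "今天天气真好，我们去公园散步吧。",                            # Chinese
    "The quick brown fox jumps over the lazy dog.",                # English
    "def fib(n): return n if n < 2 else fib(n - 1) + fib(n - 2)",  # code
]

for text in samples:
    ids = tokenizer.encode(text)
    print(f"{len(ids):3d} tokens <- {text}")
```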

Model Capabilities

Multi-turn Dialogue
Text Generation
Code Interpretation
Mathematical Reasoning
Tool Calling
Long-text Summarization
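The Tool Calling capability works through ReAct prompting: the prompt interleaves Thought / Action / Action Input / Observation steps, and the calling application executes the chosen tool between model turns. Below is a hedged sketch under stated assumptions: the simplified prompt template and the search_weather tool are illustrative only (the exact ReAct template Qwen was aligned with is documented in its own repository), and model / tokenizer are the objects from the chat sketch above.

```python
# Illustrative ReAct loop. The prompt template and search_weather are
# assumptions for demonstration; `model` and `tokenizer` come from the
# earlier chat sketch.

TOOLS_DESC = (
    "search_weather: Look up the current weather for a city. "
    "Input: the city name as plain text."
)

REACT_TEMPLATE = """Answer the question using the available tools.

Available tools:
{tools}

Use this format:
Question: the input question
Thought: what to do next
Action: the tool to use
Action Input: the input to the tool
Observation: the tool result
... (Thought/Action/Action Input/Observation can repeat)
Thought: I now know the final answer
Final Answer: the answer to the question

Question: {question}"""


def search_weather(city: str) -> str:
    # Stub tool; a real implementation would call a weather API.
    return f"It is sunny and 22°C in {city.strip()}."


prompt = REACT_TEMPLATE.format(
    tools=TOOLS_DESC, question="What is the weather in Beijing?"
)

# First call: the model is expected to emit a Thought plus Action / Action Input.
response, history = model.chat(tokenizer, prompt, history=None)

if "Action Input:" in response:
    tool_input = response.split("Action Input:")[-1].splitlines()[0]
    observation = search_weather(tool_input)
    # Feed the tool result back so the model can produce the Final Answer.
    response, history = model.chat(
        tokenizer, f"Observation: {observation}", history=history
    )

print(response)
```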

Use Cases

Intelligent Assistant
Multi-turn Dialogue System
Building a context-aware multi-turn dialogue AI assistant.
Average accuracy of 54.6% on the C-Eval Chinese test set.
Educational Applications
Programming Learning Assistance
Explaining code and generating programming examples.
24.4% Pass@1 on HumanEval code generation.