CausalLM 7B

Developed by CausalLM
A 7B-parameter causal language model compatible with the Meta LLaMA 2 architecture; it outperforms comparable models under 33B parameters in multiple evaluations
Downloads: 177
Release date: 10/22/2023

Model Overview

A causal language model trained on Qwen and LLaMA2 weights. It supports Chinese and English text generation and uses the same model architecture as LLaMA2

Model Features

High performance
Outperforms comparable 7B models on benchmarks such as MMLU, C-Eval, and GSM8K, with some metrics surpassing models under 33B parameters
Multilingual support
Supports both English and Chinese text generation
Full compatibility
Compatible with GGUF, GPTQ, and AWQ quantization formats; the base model can be loaded directly with the transformers library
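The transformers loading path described above can be sketched as follows. This is a minimal, non-authoritative sketch: the repository id "CausalLM/7B" and the helper name `load_causallm` are assumptions, so check the actual Hugging Face listing before use.

```python
def load_causallm(model_id: str = "CausalLM/7B"):
    """Load the tokenizer and model with the transformers library.

    Note: downloads roughly 14 GB of weights on first use; the model id
    "CausalLM/7B" is an assumption, not confirmed by this page.
    """
    # Imported lazily so merely defining this helper requires no download.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tokenizer = AutoTokenizer.from_pretrained(model_id)
    model = AutoModelForCausalLM.from_pretrained(model_id)
    return tokenizer, model
```

A typical follow-up would be `tokenizer(...)` to encode a prompt and `model.generate(...)` to produce text; quantized GGUF/GPTQ/AWQ variants instead require their respective loaders (e.g. llama.cpp for GGUF).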
Synthetic data training
Trained on 1.3 billion tokens of carefully curated SFT datasets; all data was manually or synthetically rewritten

Model Capabilities

Text generation
Multi-turn dialogue
Mathematical reasoning
Knowledge Q&A

Use Cases

Dialogue systems
Intelligent assistant
Building multi-turn dialogue systems based on the ChatML prompt format
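The ChatML-based multi-turn dialogue mentioned above can be sketched as a simple prompt builder. This is an illustrative assumption about the prompt layout (standard `<|im_start|>` / `<|im_end|>` delimiters); the role names and example messages are not from this page.

```python
def build_chatml_prompt(messages):
    """Render a list of {"role": ..., "content": ...} dicts as a ChatML prompt.

    The delimiter tokens follow the common ChatML convention; verify against
    the model's actual tokenizer configuration (an assumption here).
    """
    parts = []
    for msg in messages:
        parts.append(f"<|im_start|>{msg['role']}\n{msg['content']}<|im_end|>")
    # Leave the assistant turn open so the model completes it.
    parts.append("<|im_start|>assistant\n")
    return "\n".join(parts)


prompt = build_chatml_prompt([
    {"role": "system", "content": "You are a helpful assistant."},
    {"role": "user", "content": "What is 2 + 2?"},
])
print(prompt)
```

Each completed assistant reply is appended back to the message list (with `<|im_end|>` closing it) before the next user turn, which is what makes the dialogue multi-turn.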
Education
Math problem solving
Solving GSM8K-level math problems
Zero-shot accuracy: 59.2%