
Baichuan 7B

Developed by baichuan-inc
Baichuan-7B is an open-source large-scale pre-trained language model developed by Baichuan Intelligence, based on the Transformer architecture with 7 billion parameters. Trained on a bilingual Chinese-English corpus, it supports a context window of 4096 tokens.
Downloads: 20.47k
Release date: 6/13/2023

Model Overview

A large-scale bilingual Chinese-English pre-trained language model that performs strongly on authoritative benchmarks such as C-Eval and MMLU and supports tasks such as text generation.
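As a sketch of how the model might be used, assuming the Hugging Face repo id `baichuan-inc/Baichuan-7B` and that the `transformers` library is installed (the model's custom code requires `trust_remote_code=True`; running this downloads the full weights):

```python
# Hedged sketch: text generation with Baichuan-7B via Hugging Face transformers.
# The repo id and trust_remote_code flag are assumptions about the hosted model;
# the prompt follows the title->author completion style shown later in this card.
def run_demo():
    # Imports kept inside the function so defining it stays cheap;
    # calling it downloads and loads the 7B model.
    from transformers import AutoModelForCausalLM, AutoTokenizer

    tok = AutoTokenizer.from_pretrained(
        "baichuan-inc/Baichuan-7B", trust_remote_code=True
    )
    model = AutoModelForCausalLM.from_pretrained(
        "baichuan-inc/Baichuan-7B", trust_remote_code=True
    )
    inputs = tok("登鹳雀楼->王之涣\n夜雨寄北->", return_tensors="pt")
    out = model.generate(**inputs, max_new_tokens=16)
    print(tok.decode(out[0], skip_special_tokens=True))

# Call run_demo() to generate a completion (requires network access and
# enough memory for the 7B weights).
```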

Model Features

Bilingual Optimization (Chinese-English)
Trained on an independently constructed bilingual Chinese-English corpus, deeply optimized for Chinese-language scenarios, achieving top performance in C-Eval evaluations.
Permissive Open Source License
Adopts a more permissive open-source license than LLaMA (whose license prohibits commercial use), allowing commercial applications.
Long Context Support
Supports a 4096-token context window, suitable for long-text processing tasks.
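The 4096-token window is a hard budget that prompts must fit inside, leaving room for the generated reply. A minimal sketch of that bookkeeping (the `max_new_tokens` generation budget and placeholder token ids are assumptions; a real setup would use the model's own tokenizer):

```python
# Minimal sketch of budgeting Baichuan-7B's 4096-token context window.
# CONTEXT_WINDOW comes from the model card; max_new_tokens is an assumed
# generation budget, and the integer token ids are placeholders.
CONTEXT_WINDOW = 4096

def truncate_to_context(token_ids, max_new_tokens=256, window=CONTEXT_WINDOW):
    """Keep only the most recent tokens, reserving space for the reply."""
    budget = window - max_new_tokens
    return token_ids if len(token_ids) <= budget else token_ids[-budget:]

# A 5000-token prompt is trimmed to its most recent 4096 - 256 = 3840 tokens.
kept = truncate_to_context(list(range(5000)))
```

Truncating from the left keeps the most recent context, which is usually what long-text tasks want.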

Model Capabilities

Text Generation
Language Understanding
Question Answering Systems
Text Summarization

Use Cases

Education
Literary Work Analysis
Infers the author of a literary work from its title
Correctly outputs 'Night Rain Sent North -> Li Shangyin' in sample tests
Evaluation Systems
Gaokao Question Answering
Answering Chinese Gaokao multiple-choice questions
Average score: 36.24
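Probes like the title→author sample test above are typically run as few-shot completions: the prompt lists a few known `title->author` pairs and ends with the query title, and the model is expected to complete the line. A minimal sketch (the two in-context pairs are assumed examples, not from the source):

```python
# Hedged sketch of a few-shot "title -> author" prompt in the style of the
# sample test above. The two in-context pairs are assumed examples; the
# model should complete the final line with the author's name.
def build_prompt(examples, query_title):
    lines = [f"{title}->{author}" for title, author in examples]
    lines.append(f"{query_title}->")
    return "\n".join(lines)

few_shot = [
    ("Ascending Stork Tower", "Wang Zhihuan"),  # assumed example pair
    ("Quiet Night Thoughts", "Li Bai"),         # assumed example pair
]
prompt = build_prompt(few_shot, "Night Rain Sent North")
# The model is expected to continue this prompt with "Li Shangyin".
```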