# Fully trainable parameters
## Chinese-LLaMA-2-7B-16K

License: Apache-2.0
Developer: hfl

Chinese-LLaMA-2-7B-16K is a Chinese large language model built on Meta's Llama-2. It supports a 16K context length and is suitable for both inference and full-parameter training.

Tags: Large Language Model, Transformers, Supports Multiple Languages
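
Since the card tags the Transformers library, the model can presumably be loaded through the standard AutoModel API. Below is a minimal inference sketch, assuming the Hub repository id `hfl/chinese-llama-2-7b-16k` (inferred from the listed developer; the exact id is an assumption, not confirmed by this page):

```python
# Minimal inference sketch. The repository id below is an assumption
# based on the listed developer ("hfl"); adjust it to the actual Hub id.
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hfl/chinese-llama-2-7b-16k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(model_id, device_map="auto")

# The 16K context window means prompts up to roughly 16,384 tokens
# fit in a single forward pass.
prompt = "请用中文简要介绍大型语言模型。"  # "Briefly introduce large language models."
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For the full-parameter training use case mentioned above, the same checkpoint would be loaded the same way and then fine-tuned with all weights unfrozen, e.g. via the Transformers `Trainer`; at 7B parameters this typically requires multiple GPUs or memory-saving techniques such as DeepSpeed ZeRO.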