# Fully trainable parameters

## Chinese-LLaMA-2-7B-16K

License: Apache-2.0
Maintainer: hfl
Tags: Large Language Model, Transformers, Multilingual

Chinese-LLaMA-2-7B-16K is a Chinese large language model built on Meta's Llama-2. It supports a 16K context length and is suitable for both inference and full-parameter training.
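
A minimal inference sketch using Hugging Face Transformers. The repository id `hfl/chinese-llama-2-7b-16k` is an assumption based on the maintainer and model name above; adjust it if the checkpoint is published under a different id.

```python
# Minimal inference sketch for Chinese-LLaMA-2-7B-16K.
# Assumption: the checkpoint is hosted on the Hugging Face Hub as
# "hfl/chinese-llama-2-7b-16k"; change the repo id if it differs.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

model_id = "hfl/chinese-llama-2-7b-16k"

tokenizer = AutoTokenizer.from_pretrained(model_id)
model = AutoModelForCausalLM.from_pretrained(
    model_id,
    torch_dtype=torch.float16,  # half precision to fit a 7B model on one GPU
    device_map="auto",          # place weights automatically across devices
)

# Chinese prompt: "Introduce large language models in one sentence."
prompt = "请用一句话介绍大语言模型。"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
outputs = model.generate(**inputs, max_new_tokens=128, do_sample=False)
print(tokenizer.decode(outputs[0], skip_special_tokens=True))
```

For full-parameter training, the same checkpoint can be loaded with `AutoModelForCausalLM.from_pretrained` and fine-tuned with a standard causal-language-modeling loop; the 16K context window applies to both inference and training inputs.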