# Low-cost Pre-training
## Colossal-LLaMA-2-7b-base
An open-source bilingual Chinese-English large language model built on LLaMA-2 and continually pre-trained on approximately 8.5 billion additional tokens, with a context window of 4096 tokens.
Tags: Large Language Model · Transformers · Supports Multiple Languages
Publisher: hpcai-tech