# Efficient Tokenizer

## Swallow-MS-7b-v0.1

tokyotech-llm · Apache-2.0 · Large Language Model · Transformers · Multilingual

Swallow-MS-7b-v0.1 is a Japanese-enhanced model developed by TokyoTech-LLM through continued pretraining of Mistral-7B-v0.1, and it performs strongly on Japanese tasks.
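As a base model (not instruction-tuned), it is used for plain text completion via the Transformers library noted on the card. Below is a minimal sketch, assuming the model is published under the `tokyotech-llm/Swallow-MS-7b-v0.1` Hugging Face repo ID and that a GPU with enough memory is available:

```python
# Minimal Japanese text-completion sketch; repo ID assumed from the card above.
import torch
from transformers import AutoModelForCausalLM, AutoTokenizer

repo = "tokyotech-llm/Swallow-MS-7b-v0.1"  # assumed Hugging Face repo ID
tokenizer = AutoTokenizer.from_pretrained(repo)
model = AutoModelForCausalLM.from_pretrained(
    repo, torch_dtype=torch.bfloat16, device_map="auto"
)

# Japanese prompt: "The capital of Japan is"
prompt = "日本の首都は"
inputs = tokenizer(prompt, return_tensors="pt").to(model.device)
out = model.generate(**inputs, max_new_tokens=64, do_sample=True, temperature=0.7)
print(tokenizer.decode(out[0], skip_special_tokens=True))
```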
## Colossal-LLaMA-2-7b-base

hpcai-tech · Large Language Model · Transformers · Multilingual

An open-source bilingual Chinese-English large language model based on LLaMA-2, continually pretrained on roughly 8.5 billion additional tokens and supporting a 4,096-token context window.
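The "Efficient Tokenizer" tag points at what these two models share: vocabularies extended beyond their base models so that Japanese and Chinese text encode into fewer tokens. A rough way to check this is tokens per character; here is a minimal sketch, assuming both repo IDs from the cards above resolve on Hugging Face:

```python
# Compare tokenizer efficiency (tokens per character; lower is better)
# on short Japanese and Chinese samples. Repo IDs assumed from the cards above.
from transformers import AutoTokenizer

samples = {
    "ja": "東京工業大学は日本語に強い大規模言語モデルを公開した。",
    "zh": "该模型在中文和英文任务上均表现出色。",
}

for repo in ("tokyotech-llm/Swallow-MS-7b-v0.1", "hpcai-tech/Colossal-LLaMA-2-7b-base"):
    tok = AutoTokenizer.from_pretrained(repo)
    for lang, text in samples.items():
        n_tokens = len(tok(text, add_special_tokens=False)["input_ids"])
        print(f"{repo} [{lang}]: {n_tokens} tokens / {len(text)} chars "
              f"= {n_tokens / len(text):.2f} tokens/char")
```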