AIbase

# ALiBi attention mechanism

## Refact-1.6B-fim
License: OpenRAIL · by refactai · 10.06k downloads · 136 likes
Refact-1.6B is a model designed for code generation and completion, supporting multiple languages and chat functions, with performance approaching that of larger-scale models.
Tags: Large Language Model, Transformers
## MPT-7B
License: Apache-2.0 · by mosaicml · 27.19k downloads · 1,168 likes
MPT-7B is an open-source, commercially usable large language model trained by MosaicML. It was pre-trained on 1 trillion tokens of English text and code, and uses a modified Transformer architecture to improve training and inference efficiency.
Tags: Large Language Model, Transformers, Other
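The models listed under this tag use ALiBi (Attention with Linear Biases) in place of learned positional embeddings: each attention head adds a linear penalty to its attention logits that grows with the distance between query and key positions, with a per-head slope drawn from a geometric sequence. A minimal sketch in plain Python (function names are illustrative and assume a power-of-two head count; this is not code from either model's repository):

```python
def alibi_slopes(num_heads):
    # Geometric sequence of per-head slopes, 2^(-8/n), 2^(-16/n), ...
    # as described in the ALiBi paper; assumes num_heads is a power of two.
    return [2 ** (-8.0 * (i + 1) / num_heads) for i in range(num_heads)]

def alibi_bias(num_heads, seq_len):
    # bias[h][i][j] = slope_h * (j - i) for past keys (j <= i).
    # Future positions get 0 here; a causal mask is assumed to
    # handle them separately with -inf before softmax.
    slopes = alibi_slopes(num_heads)
    return [
        [[s * min(j - i, 0) for j in range(seq_len)] for i in range(seq_len)]
        for s in slopes
    ]
```

The resulting tensor of shape (heads, seq_len, seq_len) is simply added to the attention logits before softmax, which is why ALiBi models can extrapolate to sequence lengths longer than those seen in training.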
© 2025 AIbase