# Pre-trained Foundation Model

**Test Patchtst** · ibm-research · Apache-2.0
PatchTST is a pre-trained time series foundation model focused on time series forecasting tasks.
Tags: Climate Model · Downloads: 5,593 · Likes: 0
**Zamba 7B V1 Phase1** · Zyphra · Apache-2.0
Zamba-7B-v1-phase1 is a hybrid architecture that combines the Mamba state space model with a Transformer: Mamba blocks serve as the backbone, a single shared Transformer layer is applied every six Mamba blocks, and the model is trained via next-token prediction. A schematic sketch of this layout follows.
Tags: Large Language Model, Transformers · Downloads: 22 · Likes: 5
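
The sketch below only illustrates the weight-sharing pattern described above: one Transformer layer whose parameters are reused after every six backbone blocks. It is not Zyphra's implementation, `MambaBlockStub` is a hypothetical placeholder rather than a real state space block, and the exact placement of the shared layer in the released model may differ.

```python
# Schematic sketch of a Zamba-style backbone: Mamba-like blocks with ONE
# Transformer layer whose weights are re-used every six blocks.
import torch
import torch.nn as nn

class MambaBlockStub(nn.Module):
    """Placeholder stand-in for a real Mamba (state space) block."""
    def __init__(self, d_model: int):
        super().__init__()
        self.proj = nn.Linear(d_model, d_model)

    def forward(self, x):
        return x + torch.tanh(self.proj(x))  # simple residual update

class ZambaStyleBackbone(nn.Module):
    def __init__(self, d_model=512, n_blocks=24, share_every=6):
        super().__init__()
        self.blocks = nn.ModuleList([MambaBlockStub(d_model) for _ in range(n_blocks)])
        # A single Transformer layer; the SAME parameters are applied at each insertion point.
        self.shared_attn = nn.TransformerEncoderLayer(d_model, nhead=8, batch_first=True)
        self.share_every = share_every

    def forward(self, x):
        for i, block in enumerate(self.blocks):
            x = block(x)
            if (i + 1) % self.share_every == 0:
                x = self.shared_attn(x)  # shared weights reused here every six blocks
        return x

# Example: batch of 2 sequences, length 128, hidden size 512.
x = torch.randn(2, 128, 512)
print(ZambaStyleBackbone()(x).shape)  # torch.Size([2, 128, 512])
```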