
Granite TimeSeries PatchTST

Developed by ibm-granite
PatchTST is a Transformer-based model for long-term time series forecasting that uses subsequence patching and channel independence to improve prediction accuracy.
Downloads: 1,505
Release Time: 1/19/2024

Model Overview

This model is intended for time series forecasting, in particular predicting the seven channels of the ETTh1 power transformer dataset. It forecasts values for the next 96 hours from the preceding 512 hours of history.
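
As an illustration of this setup, the sketch below runs a single forecast pass. It assumes the checkpoint is loaded through the Hugging Face transformers PatchTST classes and that the checkpoint name is ibm-granite/granite-timeseries-patchtst; both should be verified against the model card.

```python
# Minimal inference sketch (checkpoint name and output attribute assumed;
# verify against the model card and your transformers version).
import torch
from transformers import PatchTSTForPrediction

model = PatchTSTForPrediction.from_pretrained("ibm-granite/granite-timeseries-patchtst")
model.eval()

# ETTh1-style input: batch of 1, 512 historical hours, 7 channels.
past_values = torch.randn(1, 512, 7)

with torch.no_grad():
    outputs = model(past_values=past_values)

# Forecast for the next 96 hours, one series per channel.
forecast = outputs.prediction_outputs  # expected shape: (1, 96, 7)
print(forecast.shape)
```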

Model Features

Subsequence patching technique
Divides the time series into fixed-size subsequence patches that serve as Transformer inputs, preserving local semantic information while reducing computational cost (see the patching sketch after this list).
Channel independence
Each channel is processed as a univariate time series, sharing the same embedding and Transformer weights, enabling the model to focus on longer historical data.
Modular design
Supports masked time series pre-training as well as direct time series forecasting, classification, and regression tasks.
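
To make the patching step concrete, the sketch below splits one 512-hour channel into overlapping fixed-size patches. The patch length of 16 and stride of 8 are illustrative values, not necessarily this checkpoint's configuration.

```python
# Illustrative patching sketch (patch_length and stride are example values).
import torch

context_length, patch_length, stride = 512, 16, 8
series = torch.randn(context_length)  # one channel, 512 hours

# unfold turns the series into overlapping fixed-size patches:
# number of patches = (context_length - patch_length) // stride + 1 = 63
patches = series.unfold(dimension=0, size=patch_length, step=stride)
print(patches.shape)  # torch.Size([63, 16])
```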

Model Capabilities

Time series forecasting
Long-term time series modeling
Multi-channel time series processing

Use Cases

Power systems
Power transformer load forecasting
Forecasts power transformer load for the next 96 hours
Achieves an MSE of 0.3881 on the ETTh1 test set
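
For reference, this MSE is the mean squared error over the 96 forecast steps and all seven channels. The minimal sketch below computes it for hypothetical forecast and target tensors; ETTh1 benchmark figures such as 0.3881 are typically reported on standardized data.

```python
# Minimal MSE sketch for a 96-step, 7-channel forecast (tensor contents are
# placeholders; benchmark results are usually computed on standardized data).
import torch

forecast = torch.randn(32, 96, 7)  # model predictions
target = torch.randn(32, 96, 7)    # ground-truth future values

mse = torch.mean((forecast - target) ** 2)
print(f"MSE: {mse.item():.4f}")
```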