🚀 PatchTSMixer model pre-trained on ETTh1 dataset
PatchTSMixer is a lightweight, high-speed multivariate time series forecasting model that achieves state-of-the-art performance on benchmark datasets. Here, we present a pre-trained PatchTSMixer model covering all seven channels of the ETTh1 dataset. When forecasting 96 hours into the future from a 512-hour historical window, this pre-trained model yields a Mean Squared Error (MSE) of 0.37 on the test split of the ETTh1 dataset.
For training and evaluating a PatchTSMixer model, you can refer to this notebook.
✨ Features
Model Details
The PatchTSMixer model was proposed in TSMixer: Lightweight MLP-Mixer Model for Multivariate Time Series Forecasting by Vijay Ekambaram, Arindam Jati, Nam Nguyen, Phanwadee Sinthong, and Jayant Kalagnanam.
PatchTSMixer is a lightweight time-series modeling approach based on the MLP-Mixer architecture. In this Hugging Face implementation, it can easily perform lightweight mixing across patches, channels, and hidden features for effective multivariate time-series modeling. It also supports various attention mechanisms, from simple gated attention to more complex self-attention blocks that can be customized. The model can be pre-trained and then used for various downstream tasks such as forecasting, classification, and regression.
Model Description
TSMixer is a lightweight neural architecture composed solely of multi-layer perceptron (MLP) modules, designed for multivariate forecasting and representation learning on patched time series. Inspired by the success of MLP-Mixer models in computer vision, we address the challenges of adapting Vision MLP-Mixer for time series and introduce empirically validated components to improve accuracy. This includes a novel design of attaching online reconciliation heads to the MLP-Mixer backbone to explicitly model time-series properties like hierarchy and channel correlations. We also propose a Hybrid channel modeling approach to handle noisy channel interactions and generalization across diverse datasets, a common issue in existing patch channel-mixing methods. Additionally, a simple gated attention mechanism is introduced in the backbone to prioritize important features. By incorporating these lightweight components, we enhance the learning ability of simple MLP structures, outperforming complex Transformer models with minimal computing resources. Moreover, TSMixer's modular design is compatible with both supervised and masked self-supervised learning methods, making it a promising building block for time-series Foundation Models. TSMixer outperforms state-of-the-art MLP and Transformer models in forecasting by 8-60% and the latest strong benchmarks of Patch-Transformer models by 1-2%, while significantly reducing memory and runtime (2-3X).

Model Sources
📦 Installation
PatchTSMixer ships with the Hugging Face `transformers` library; install it along with PyTorch: `pip install transformers torch`.
💻 Usage Examples
This pre-trained model can be used for fine-tuning or evaluation on any Electrical Transformer dataset with the same channels as ETTh1, namely HUFL, HULL, MUFL, MULL, LUFL, LULL, and OT. The model predicts the next 96 hours based on the input values from the preceding 512 hours. It is essential to normalize the data; for more details on data pre-processing, please refer to the paper or the demo.
You can use the following code to get started with the model:
Demo
📚 Documentation
Training Details
Training Data
ETTh1 train split.
The train/validation/test splits are shown in the demo.
Training Hyperparameters
Please refer to the PatchTSMixer paper.
Evaluation
Testing Data, Factors & Metrics
Testing Data
ETTh1 test split.
The train/validation/test splits are shown in the demo.
Metrics
Mean Squared Error (MSE).
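For reference, the reported score is the MSE averaged over all forecast timesteps and channels, computed on the normalized data; a minimal sketch:

```python
import numpy as np

def mse(y_true, y_pred):
    """Mean Squared Error averaged over all timesteps and channels."""
    y_true = np.asarray(y_true, dtype=float)
    y_pred = np.asarray(y_pred, dtype=float)
    return float(np.mean((y_true - y_pred) ** 2))

# Example: a 96-hour x 7-channel forecast window
y_true = np.zeros((96, 7))
y_pred = np.ones((96, 7))
print(mse(y_true, y_pred))  # 1.0
```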
Results
MSE of 0.37 on the ETTh1 test split (96-hour forecast horizon, 512-hour context window).
Hardware
1 NVIDIA A100 GPU
Software
PyTorch
📄 License
The model is licensed under the Apache-2.0 license.
📚 Citation
BibTeX:
@article{ekambaram2023tsmixer,
  title={TSMixer: Lightweight MLP-Mixer Model for Multivariate Time Series Forecasting},
  author={Ekambaram, Vijay and Jati, Arindam and Nguyen, Nam and Sinthong, Phanwadee and Kalagnanam, Jayant},
  journal={arXiv preprint arXiv:2306.09364},
  year={2023}
}
APA:
Ekambaram, V., Jati, A., Nguyen, N., Sinthong, P., & Kalagnanam, J. (2023). TSMixer: Lightweight MLP-Mixer Model for Multivariate Time Series Forecasting. arXiv preprint arXiv:2306.09364.
| Property | Details |
|---|---|
| Model Type | PatchTSMixer, a lightweight and fast multivariate time series forecasting model |
| Training Data | ETTh1 train split |
| Metrics | Mean Squared Error (MSE) |
| Pipeline Tag | time-series-forecasting |
| Tags | time series, forecasting, pretrained models, foundation models, time series foundation models, time-series |