🚀 Chronos-Bolt⚡ (Mini)
Chronos-Bolt is a family of pretrained time series forecasting models designed for zero-shot forecasting. It is built on the T5 encoder-decoder architecture and trained on nearly 100 billion time series observations. The model chunks the historical context into patches of multiple observations and feeds them to the encoder; the decoder then generates quantile forecasts for multiple future steps in a single pass, an approach known as direct multi-step forecasting. Compared to the original Chronos models of the same size, Chronos-Bolt models are up to 250 times faster and 20 times more memory-efficient.
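The patching idea can be illustrated with a toy sketch. The patch size and truncation behavior below are illustrative assumptions, not Chronos-Bolt's actual configuration:

```python
def make_patches(series, patch_size):
    # Chunk a 1-D series into non-overlapping patches of patch_size,
    # dropping the oldest observations if the length is not divisible.
    usable = len(series) // patch_size * patch_size
    trimmed = series[len(series) - usable:]
    return [trimmed[i:i + patch_size] for i in range(0, usable, patch_size)]

# 10 observations with patch_size=4: the 2 oldest are dropped, leaving
# two patches that play the role of encoder input tokens.
patches = make_patches(list(range(10)), patch_size=4)
```

Because each patch covers several observations, the encoder sees a much shorter token sequence than one token per time step, which is one source of Chronos-Bolt's speedup.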
📢 Update
🚀 Feb 14, 2025: Chronos-Bolt models are now available on Amazon SageMaker JumpStart! Check out the tutorial notebook to learn how to deploy Chronos endpoints for production use in a few lines of code.
✨ Features
- High Speed: Significantly faster inference time compared to the original Chronos models.
- High Accuracy: Outperforms commonly used statistical and deep learning models, even in zero-shot scenarios.
- Multiple Sizes: Available in different sizes to meet various requirements.
- Easy to Use: Can be easily integrated with AutoGluon for fine-tuning, incorporating covariates, and ensembling.
- Deployable: Can be deployed to SageMaker endpoints for production use.
📦 Installation
The installation steps vary depending on the usage scenario:

- For usage with AutoGluon:

```shell
pip install autogluon
```

- For SageMaker deployment:

```shell
pip install -U sagemaker
```

- For inference library usage:

```shell
pip install chronos-forecasting
```
💻 Usage Examples
📈 Basic Usage with AutoGluon
A minimal example showing how to perform zero-shot inference using Chronos-Bolt with AutoGluon:
```python
from autogluon.timeseries import TimeSeriesPredictor, TimeSeriesDataFrame

df = TimeSeriesDataFrame("https://autogluon.s3.amazonaws.com/datasets/timeseries/m4_hourly/train.csv")

predictor = TimeSeriesPredictor(prediction_length=48).fit(
    df,
    hyperparameters={
        "Chronos": {"model_path": "amazon/chronos-bolt-mini"},
    },
)

predictions = predictor.predict(df)
```
🚀 Deploying a Chronos-Bolt endpoint to SageMaker
A minimal example showing how to deploy a Chronos-Bolt (Base) endpoint to SageMaker:
```python
from sagemaker.jumpstart.model import JumpStartModel

model = JumpStartModel(
    model_id="autogluon-forecasting-chronos-bolt-base",
    instance_type="ml.c5.2xlarge",
)
predictor = model.deploy()
```
Now you can send time series data to the endpoint in JSON format.
```python
import pandas as pd

df = pd.read_csv("https://raw.githubusercontent.com/AileenNielsen/TimeSeriesAnalysisWithPython/master/data/AirPassengers.csv")

payload = {
    "inputs": [
        {"target": df["#Passengers"].tolist()}
    ],
    "parameters": {
        "prediction_length": 12,
    }
}

forecast = predictor.predict(payload)["predictions"]
```
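The request body must be JSON-serializable. A self-contained sketch of the payload shape, with a small synthetic target in place of the CSV data, showing that it round-trips through JSON cleanly:

```python
import json

# Synthetic stand-in for df["#Passengers"].tolist()
target = [112.0, 118.0, 132.0, 129.0, 121.0, 135.0]

payload = {
    "inputs": [{"target": target}],
    "parameters": {"prediction_length": 12},
}

# The endpoint receives this serialized form as the request body.
body = json.dumps(payload)
decoded = json.loads(body)
```

Keeping targets as plain Python lists of floats (rather than numpy arrays or pandas objects) avoids serialization errors when the SDK encodes the request.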
🧐 Usage with inference library
A minimal example showing how to perform inference using Chronos-Bolt models:
```python
import pandas as pd
import torch
from chronos import BaseChronosPipeline

pipeline = BaseChronosPipeline.from_pretrained(
    "amazon/chronos-bolt-mini",
    device_map="cuda",
    torch_dtype=torch.bfloat16,
)

df = pd.read_csv(
    "https://raw.githubusercontent.com/AileenNielsen/TimeSeriesAnalysisWithPython/master/data/AirPassengers.csv"
)

forecast = pipeline.predict(
    context=torch.tensor(df["#Passengers"]), prediction_length=12
)
```
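Chronos-Bolt produces quantile forecasts rather than samples. Assuming nine quantile levels from 0.1 to 0.9 (an assumption worth verifying against your pipeline's configuration), the median and an 80% prediction interval can be read off by index. A sketch with a synthetic array standing in for the model output:

```python
# Assumed quantile levels; check these against the actual pipeline.
quantile_levels = [round(0.1 * q, 1) for q in range(1, 10)]  # 0.1 .. 0.9

# Synthetic stand-in for one series' forecast,
# shaped [num_quantiles][prediction_length]:
forecast_one_series = [[100 + 10 * i + j for j in range(12)] for i in range(9)]

median = forecast_one_series[quantile_levels.index(0.5)]  # point forecast
lower = forecast_one_series[quantile_levels.index(0.1)]   # interval lower bound
upper = forecast_one_series[quantile_levels.index(0.9)]   # interval upper bound
```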
📚 Documentation
For more details on using Chronos with AutoGluon, check out the AutoGluon Chronos tutorial. For deploying Chronos-Bolt endpoints to SageMaker, refer to the example notebook.
🔧 Technical Details
Performance Comparison
The following plot compares the inference time of Chronos-Bolt against the original Chronos models for forecasting 1024 time series with a context length of 512 observations and a prediction horizon of 64 steps.
The following plot reports the probabilistic and point forecasting performance of Chronos-Bolt in terms of the Weighted Quantile Loss (WQL) and the Mean Absolute Scaled Error (MASE), respectively, aggregated over 27 datasets (see the Chronos paper for details on this benchmark).
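For reference, the two metrics can be written as follows (the notation here is a standard formulation, not taken from the figures): the pinball loss at level $q$ underlies WQL, which normalizes by the absolute scale of the target, while MASE normalizes the absolute forecast error over horizon $H$ by the in-sample error of the seasonal naive forecast with seasonal period $m$:

```latex
\rho_q(y, \hat{y}) =
\begin{cases}
  q\,(y - \hat{y}) & \text{if } y \ge \hat{y} \\
  (1 - q)\,(\hat{y} - y) & \text{if } y < \hat{y}
\end{cases}
\qquad
\mathrm{WQL} = \frac{2 \sum_{q \in Q} \sum_{t} \rho_q\!\left(y_t, \hat{y}_t^{(q)}\right)}{\sum_{t} |y_t|}

\mathrm{MASE} = \frac{\tfrac{1}{H} \sum_{t=T+1}^{T+H} \left| y_t - \hat{y}_t \right|}
                     {\tfrac{1}{T-m} \sum_{t=m+1}^{T} \left| y_t - y_{t-m} \right|}
```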
Model Sizes
Chronos-Bolt models are available in the following sizes:

| Model | Parameters | Based on |
| ----- | ---------- | -------- |
| chronos-bolt-tiny | 9M | t5-efficient-tiny |
| chronos-bolt-mini | 21M | t5-efficient-mini |
| chronos-bolt-small | 48M | t5-efficient-small |
| chronos-bolt-base | 205M | t5-efficient-base |
📄 License
This project is licensed under the Apache-2.0 License.
📖 Citation
If you find Chronos or Chronos-Bolt models useful for your research, please consider citing the associated paper:
```bibtex
@article{ansari2024chronos,
  title={Chronos: Learning the Language of Time Series},
  author={Ansari, Abdul Fatir and Stella, Lorenzo and Turkmen, Caner and Zhang, Xiyuan and Mercado, Pedro and Shen, Huibin and Shchur, Oleksandr and Rangapuram, Syama Sundar and Pineda Arango, Sebastian and Kapoor, Shubham and Zschiegner, Jasper and Maddix, Danielle C. and Mahoney, Michael W. and Torkkola, Kari and Gordon Wilson, Andrew and Bohlke-Schneider, Michael and Wang, Yuyang},
  journal={Transactions on Machine Learning Research},
  issn={2835-8856},
  year={2024},
  url={https://openreview.net/forum?id=gerNCVqqtR}
}
```