# 🚀 Chronos-Bolt⚡ (Small)
Chronos-Bolt is a family of pretrained time series forecasting models designed for zero-shot forecasting. It is built on the T5 encoder-decoder architecture and trained on nearly 100 billion time series observations. It chunks the historical time series context into patches of multiple observations, which are fed to the encoder; the decoder then uses these representations to directly generate quantile forecasts across multiple future steps, a method known as direct multi-step forecasting. Compared to the original Chronos models of the same size, Chronos-Bolt models are more accurate, up to 250 times faster, and 20 times more memory-efficient.
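To make direct multi-step forecasting concrete, here is a minimal sketch using the open-source `chronos-forecasting` package (`pip install chronos-forecasting`); the toy context values are made up for illustration:

```python
import torch
from chronos import BaseChronosPipeline

# Load the pretrained Chronos-Bolt (Small) pipeline; on a GPU you would
# typically pass device_map="cuda" and torch_dtype=torch.bfloat16.
pipeline = BaseChronosPipeline.from_pretrained(
    "autogluon/chronos-bolt-small",
    device_map="cpu",
)

# Toy historical context (in practice: your observed series).
context = torch.tensor([112.0, 118.0, 132.0, 129.0, 121.0, 135.0, 148.0, 148.0])

# A single forward pass produces forecasts for all 4 future steps at the
# requested quantile levels; no autoregressive step-by-step decoding.
quantiles, mean = pipeline.predict_quantiles(
    context=context,
    prediction_length=4,
    quantile_levels=[0.1, 0.5, 0.9],
)
print(quantiles.shape)  # torch.Size([1, 4, 3]): batch x horizon x quantiles
```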
## 🚀 Quick Start
🚀 Update Feb 14, 2025: Chronos-Bolt models are now available on Amazon SageMaker JumpStart! Check out the tutorial notebook to learn how to deploy Chronos endpoints for production use in a few lines of code.
## ✨ Features
- **High Performance**: Significantly faster and more accurate than the original Chronos models of the same size.
- **Zero-shot Forecasting**: Produces forecasts for datasets it has never been trained on, with no task-specific training required.
- **Multiple Sizes**: Available in four sizes, from Tiny (9M parameters) to Base (205M parameters); see the table below.
- **Advanced Features**: Supports fine-tuning and forecasting with covariates (see the tutorial linked below).
- **Deployment Flexibility**: Can be deployed to both CPU and GPU instances on SageMaker.
## 📦 Installation
Install AutoGluon and its dependencies:

```bash
pip install autogluon
```

Update the SageMaker SDK (needed for the SageMaker deployment example):

```bash
pip install -U sagemaker
```
## 💻 Usage Examples
### Basic Usage: Zero-shot inference with Chronos-Bolt in AutoGluon
```python
from autogluon.timeseries import TimeSeriesPredictor, TimeSeriesDataFrame

# Load the M4 hourly training data as a TimeSeriesDataFrame
df = TimeSeriesDataFrame("https://autogluon.s3.amazonaws.com/datasets/timeseries/m4_hourly/train.csv")

# Fit a predictor that forecasts 48 steps ahead using Chronos-Bolt (Small) zero-shot
predictor = TimeSeriesPredictor(prediction_length=48).fit(
    df,
    hyperparameters={
        "Chronos": {"model_path": "autogluon/chronos-bolt-small"},
    },
)

# Generate forecasts for the next 48 steps of each series
predictions = predictor.predict(df)
```
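By default, `predictions` is a `TimeSeriesDataFrame` holding the mean forecast alongside quantile forecasts; the column names below assume AutoGluon's default quantile levels (0.1 through 0.9):

```python
# Point forecast plus the 80% prediction interval for each series
# (column names assume AutoGluon's default quantile levels).
print(predictions[["mean", "0.1", "0.9"]].head())
```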
### Advanced Usage: Deploying a Chronos-Bolt endpoint to SageMaker
```python
from sagemaker.jumpstart.model import JumpStartModel

# Create and deploy a SageMaker JumpStart endpoint running Chronos-Bolt (Small)
model = JumpStartModel(
    model_id="autogluon-forecasting-chronos-bolt-small",
    instance_type="ml.c5.2xlarge",
)
predictor = model.deploy()
```

Then send time series to the endpoint in JSON format:

```python
import pandas as pd

# Classic AirPassengers dataset: monthly airline passenger counts
df = pd.read_csv("https://raw.githubusercontent.com/AileenNielsen/TimeSeriesAnalysisWithPython/master/data/AirPassengers.csv")

# Request a forecast for the next 12 months of the series
payload = {
    "inputs": [
        {"target": df["#Passengers"].tolist()},
    ],
    "parameters": {
        "prediction_length": 12,
    },
}
forecast = predictor.predict(payload)["predictions"]
```
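Each entry in `forecast` corresponds to one input series. Assuming the endpoint's default output format, it is a dictionary with the mean forecast and one key per quantile level, which tabulates nicely:

```python
# Hypothetical inspection of the first forecast; the "mean" and quantile
# keys assume the endpoint's default output format.
forecast_df = pd.DataFrame(forecast[0])
print(forecast_df.head())
```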
For more advanced features such as fine-tuning and forecasting with covariates, check out [this tutorial](https://auto.gluon.ai/stable/tutorials/timeseries/forecasting-chronos.html); a sketch of both follows below. For more details about the endpoint API, check out the example notebook.
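As a rough sketch of what those two features look like in AutoGluon (the `fine_tune`, `covariate_regressor`, and `target_scaler` options and the `weekend` covariate follow the patterns in that tutorial; treat the exact values as assumptions):

```python
from autogluon.timeseries import TimeSeriesPredictor

# `df` is a TimeSeriesDataFrame, as in the Basic Usage example above.

# Fine-tuning: the "fine_tune" flag asks AutoGluon to further train the
# pretrained Chronos-Bolt weights on your data.
predictor = TimeSeriesPredictor(prediction_length=48).fit(
    df,
    hyperparameters={
        "Chronos": {"model_path": "autogluon/chronos-bolt-small", "fine_tune": True},
    },
)

# Covariates: Chronos-Bolt itself is univariate, so AutoGluon pairs it with
# a tabular regressor on the known covariates ("weekend" is a hypothetical
# column assumed to exist in df and in the future covariates).
predictor = TimeSeriesPredictor(
    prediction_length=48,
    known_covariates_names=["weekend"],
).fit(
    df,
    hyperparameters={
        "Chronos": {
            "model_path": "autogluon/chronos-bolt-small",
            "covariate_regressor": "CAT",   # CatBoost on the covariates
            "target_scaler": "standard",
        },
    },
)
```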
## 📚 Documentation
### Performance Comparison
The following plot compares the inference time of Chronos-Bolt against the original Chronos models for forecasting 1024 time series with a context length of 512 observations and a prediction horizon of 64 steps.
The following plot reports the probabilistic and point forecasting performance of Chronos-Bolt in terms of the [Weighted Quantile Loss (WQL)](https://auto.gluon.ai/stable/tutorials/timeseries/forecasting-metrics.html#autogluon.timeseries.metrics.WQL) and the [Mean Absolute Scaled Error (MASE)](https://auto.gluon.ai/stable/tutorials/timeseries/forecasting-metrics.html#autogluon.timeseries.metrics.MASE), respectively, aggregated over 27 datasets (see the [Chronos paper](https://arxiv.org/abs/2403.07815) for details on this benchmark).
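For reference, here are sketches of the two metrics for a single series; the notation is mine ($y_t$ the observed value, $\hat{y}_t$ the point forecast, $\hat{y}_t^{(q)}$ the quantile-$q$ forecast, $H$ the horizon, $T$ the history length, $m$ the seasonal period, $Q$ the set of quantile levels), and the linked AutoGluon pages give the exact definitions and the aggregation across series:

$$
\mathrm{MASE} = \frac{\frac{1}{H}\sum_{t=T+1}^{T+H} \lvert y_t - \hat{y}_t \rvert}{\frac{1}{T-m}\sum_{t=m+1}^{T} \lvert y_t - y_{t-m} \rvert}
$$

$$
\mathrm{WQL} = \frac{1}{\lvert Q \rvert} \sum_{q \in Q} \frac{2 \sum_{t} \left[ q\,(y_t - \hat{y}_t^{(q)})_{+} + (1-q)\,(\hat{y}_t^{(q)} - y_t)_{+} \right]}{\sum_{t} \lvert y_t \rvert}
$$

where $(x)_{+} = \max(x, 0)$.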
### Model Sizes
Chronos-Bolt models are available in the following sizes.
| Model | Parameters | Based on |
| ---------------------------------------------------------------------- | ---------- | ---------------------------------------------------------------------- |
| [**chronos-bolt-tiny**](https://huggingface.co/autogluon/chronos-bolt-tiny) | 9M | [t5-efficient-tiny](https://huggingface.co/google/t5-efficient-tiny) |
| [**chronos-bolt-mini**](https://huggingface.co/autogluon/chronos-bolt-mini) | 21M | [t5-efficient-mini](https://huggingface.co/google/t5-efficient-mini) |
| [**chronos-bolt-small**](https://huggingface.co/autogluon/chronos-bolt-small) | 48M | [t5-efficient-small](https://huggingface.co/google/t5-efficient-small) |
| [**chronos-bolt-base**](https://huggingface.co/autogluon/chronos-bolt-base) | 205M | [t5-efficient-base](https://huggingface.co/google/t5-efficient-base) |
## 📄 License
This project is licensed under the Apache-2.0 License.
## 📚 Citation
If you find Chronos or Chronos-Bolt models useful for your research, please consider citing the associated paper:
```bibtex
@article{ansari2024chronos,
  title={Chronos: Learning the Language of Time Series},
  author={Ansari, Abdul Fatir and Stella, Lorenzo and Turkmen, Caner and Zhang, Xiyuan and Mercado, Pedro and Shen, Huibin and Shchur, Oleksandr and Rangapuram, Syama Sundar and Pineda Arango, Sebastian and Kapoor, Shubham and Zschiegner, Jasper and Maddix, Danielle C. and Mahoney, Michael W. and Torkkola, Kari and Gordon Wilson, Andrew and Bohlke-Schneider, Michael and Wang, Yuyang},
  journal={Transactions on Machine Learning Research},
  issn={2835-8856},
  year={2024},
  url={https://openreview.net/forum?id=gerNCVqqtR}
}
```