
Granite TimeSeries TTM R1

Developed by ibm-granite
TinyTimeMixers (TTMs) are compact pre-trained models for multivariate time series forecasting, open-sourced by IBM Research. With fewer than 1 million parameters, they excel at zero-shot and few-shot forecasting tasks.
Downloads 1.2M
Release Time: 4/5/2024

Model Overview

TTM is a lightweight time series forecasting model pre-trained on public time series data using various enhancement techniques. It offers state-of-the-art zero-shot forecasting and, after fine-tuning on as little as 5% of the training data, achieves competitive multivariate forecasting performance.
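A minimal zero-shot inference sketch is shown below. It assumes the granite-tsfm toolkit (the tsfm_public package) exposing TinyTimeMixerForPrediction, and the default TTM-R1 window sizes of a 512-step context and a 96-step horizon; the exact output attribute name used here (prediction_outputs) may differ across toolkit versions.

```python
# Zero-shot forecasting sketch (assumes the granite-tsfm / tsfm_public toolkit).
import torch
from tsfm_public import TinyTimeMixerForPrediction

# Load the pre-trained TTM-R1 checkpoint (512-step context -> 96-step horizon).
model = TinyTimeMixerForPrediction.from_pretrained(
    "ibm-granite/granite-timeseries-ttm-r1"
)
model.eval()

# Dummy batch: 2 series, 512 historical time steps, 3 channels (variables).
past_values = torch.randn(2, 512, 3)

with torch.no_grad():
    outputs = model(past_values=past_values)

# Forecasts 96 future steps for every channel.
print(outputs.prediction_outputs.shape)  # expected: (2, 96, 3)
```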

Model Features

Lightweight design
With fewer than 1 million parameters, it pioneers the concept of 'miniature' pre-trained time series forecasting models, suitable for low-resource deployment.
Zero-shot forecasting capability
In zero-shot forecasting, it outperforms several popular benchmark models that require billions of parameters.
Rapid fine-tuning
Achieves competitive multivariate forecasting performance after fine-tuning on as little as 5% of the training data (see the fine-tuning sketch after this list).
Focused pre-training
Each pre-trained TTM targets specific forecasting scenarios (determined by context length and forecast length), maintaining high accuracy.
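The sketch below illustrates one way such a few-shot fine-tune could look, assuming the granite-tsfm toolkit together with the Hugging Face Trainer. The synthetic random-walk dataset is only a placeholder for a real few-shot split; a real workflow would typically build (context, horizon) windows from actual data with the toolkit's preprocessing utilities.

```python
# Few-shot fine-tuning sketch (assumptions: tsfm_public model class,
# Hugging Face Trainer, synthetic data standing in for a 5% training subset).
import torch
from torch.utils.data import Dataset
from transformers import Trainer, TrainingArguments
from tsfm_public import TinyTimeMixerForPrediction

class SyntheticWindows(Dataset):
    """Toy (context, horizon) windows cut from random walks; a stand-in
    for a real few-shot subset of your training data."""
    def __init__(self, n=128, context=512, horizon=96, channels=3):
        series = torch.randn(n, context + horizon, channels).cumsum(dim=1)
        self.past, self.future = series[:, :context], series[:, context:]

    def __len__(self):
        return len(self.past)

    def __getitem__(self, i):
        # Keys match the model's forward signature; the loss is computed
        # internally when future_values is provided.
        return {"past_values": self.past[i], "future_values": self.future[i]}

model = TinyTimeMixerForPrediction.from_pretrained(
    "ibm-granite/granite-timeseries-ttm-r1"
)

args = TrainingArguments(
    output_dir="ttm_finetuned",
    num_train_epochs=10,
    per_device_train_batch_size=64,
    learning_rate=1e-3,
    report_to="none",
)

trainer = Trainer(
    model=model,
    args=args,
    train_dataset=SyntheticWindows(),       # few-shot training windows
    eval_dataset=SyntheticWindows(n=32),    # small validation split
)
trainer.train()
```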

Model Capabilities

Multivariate time series forecasting
Zero-shot forecasting
Few-shot fine-tuned forecasting
Supports exogenous variable forecasting
Supports static categorical features
Rolling forecasting (see the sketch after this list)
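Below is a hedged sketch of rolling forecasting, which extends the horizon beyond the model's native 96-step forecast by feeding each prediction back in as context. It assumes the same TinyTimeMixerForPrediction API and output attribute as in the zero-shot example above.

```python
# Rolling-forecast sketch: chain 96-step forecasts to cover a longer horizon.
import torch
from tsfm_public import TinyTimeMixerForPrediction

CONTEXT, HORIZON = 512, 96  # native window sizes of the TTM-R1 checkpoint

def rolling_forecast(model, history, total_steps):
    """history: (1, CONTEXT, channels) tensor of past values.
    Returns a (1, total_steps, channels) tensor of chained forecasts."""
    window, chunks, produced = history.clone(), [], 0
    model.eval()
    with torch.no_grad():
        while produced < total_steps:
            out = model(past_values=window[:, -CONTEXT:, :])
            step = out.prediction_outputs          # (1, HORIZON, channels)
            chunks.append(step)
            window = torch.cat([window, step], dim=1)  # feed forecasts back in
            produced += HORIZON
    return torch.cat(chunks, dim=1)[:, :total_steps, :]

model = TinyTimeMixerForPrediction.from_pretrained(
    "ibm-granite/granite-timeseries-ttm-r1"
)
history = torch.randn(1, CONTEXT, 3)               # dummy 3-channel history
long_forecast = rolling_forecast(model, history, total_steps=288)
print(long_forecast.shape)                         # (1, 288, 3)
```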

Use Cases

Time series forecasting
Electricity demand forecasting
Forecasts future electricity demand, suitable for smart grid management.
Performs well on the Australian electricity demand dataset.
Weather forecasting
Forecasts future weather conditions, suitable for meteorological applications.
Performs well on the Australian weather dataset.
Financial time series forecasting
Forecasts financial time series such as Bitcoin prices.
Performs well on the Bitcoin dataset.