
TimeMoE-200M

Developed by Maple728
TimeMoE-200M belongs to Time-MoE, a family of billion-scale time series foundation models built on the Mixture of Experts (MoE) architecture, and is focused on time series forecasting tasks.
Downloads: 14.01k
Release Time: 9/21/2024

Model Overview

TimeMoE-200M is a foundation model designed specifically for time series forecasting, employing a Mixture of Experts architecture to process large-scale time series data efficiently.
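The model can be loaded through Hugging Face transformers. The following is a minimal sketch, assuming the AutoModelForCausalLM remote-code interface documented on the Maple728/TimeMoE-200M model page; the context length and forecast horizon here are illustrative, not prescribed values.

```python
# Minimal loading/inference sketch (assumed remote-code interface; check
# the Maple728/TimeMoE-200M Hub page for the exact, authoritative API).
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Maple728/TimeMoE-200M",
    trust_remote_code=True,  # the model ships custom modeling code on the Hub
)

# A univariate context window, shape (batch, context_length).
context = torch.randn(1, 512)

# Autoregressively generate the next 96 points (the forecast horizon).
output = model.generate(context, max_new_tokens=96)
print(output.shape)  # expected (1, 512 + 96): context followed by forecast
```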

Model Features

Mixture of Experts Architecture
Uses a sparse Mixture of Experts (MoE) architecture to process large-scale time series data efficiently (see the sketch after this list).
Billion-scale Model Family
Part of the Time-MoE family, which scales to billions of parameters, making it suitable for complex time series forecasting tasks.
Foundation Model
As a foundation model, it can adapt to various time series forecasting scenarios.
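The sketch below illustrates the general idea behind an MoE layer: a learned router sends each token to its top-k expert feed-forward networks, so only a fraction of the total parameters is active for any given input. This is a generic illustration of the technique, not TimeMoE's actual implementation; all sizes are made up.

```python
# Generic sparse MoE layer: route each token to its top-k experts and
# combine their outputs with renormalized router weights.
import torch
import torch.nn as nn
import torch.nn.functional as F

class MoELayer(nn.Module):
    def __init__(self, d_model=256, d_ff=1024, n_experts=8, top_k=2):
        super().__init__()
        self.router = nn.Linear(d_model, n_experts)
        self.experts = nn.ModuleList([
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        ])
        self.top_k = top_k

    def forward(self, x):  # x: (tokens, d_model)
        weights = F.softmax(self.router(x), dim=-1)          # (tokens, n_experts)
        top_w, top_i = weights.topk(self.top_k, dim=-1)      # (tokens, top_k)
        top_w = top_w / top_w.sum(dim=-1, keepdim=True)      # renormalize over chosen experts
        out = torch.zeros_like(x)
        for k in range(self.top_k):
            for e, expert in enumerate(self.experts):
                mask = top_i[:, k] == e                      # tokens whose k-th choice is expert e
                if mask.any():
                    out[mask] += top_w[mask, k].unsqueeze(1) * expert(x[mask])
        return out

layer = MoELayer()
print(layer(torch.randn(10, 256)).shape)  # torch.Size([10, 256])
```

Because only top_k of n_experts run per token, compute cost tracks the activated parameters rather than the total parameter count, which is what lets MoE models scale capacity cheaply.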

Model Capabilities

Time Series Forecasting
Large-scale Data Processing

Use Cases

Time Series Forecasting
Financial Time Series Forecasting
Forecasting financial series such as stock prices and exchange rates.
Meteorological Data Forecasting
Forecasting meteorological variables such as temperature and precipitation.
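As an end-to-end sketch of one such use case, the example below forecasts a toy temperature-like series. It standardizes the context before generation and inverts the scaling afterward, a normalize-then-denormalize workflow assumed from the model's Hub documentation; the data and horizon are purely illustrative.

```python
# Hedged usage sketch: forecast a synthetic "temperature" series with
# zero-mean/unit-variance scaling applied before generation and undone after.
import torch
from transformers import AutoModelForCausalLM

model = AutoModelForCausalLM.from_pretrained(
    "Maple728/TimeMoE-200M", trust_remote_code=True
)

series = torch.sin(torch.arange(256) / 8.0) + 20.0  # toy periodic data
mean, std = series.mean(), series.std()
context = ((series - mean) / std).unsqueeze(0)      # (1, context_length)

horizon = 24
output = model.generate(context, max_new_tokens=horizon)
forecast = output[0, -horizon:] * std + mean        # de-normalize the new points
print(forecast)
```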