
TimeMoE-50M

Developed by Maple728
TimeMoE is a family of time series foundation models that scales to billion-level parameter counts, built on the Mixture of Experts (MoE) architecture and focused on time series forecasting; TimeMoE-50M is the 50M-parameter variant.
Downloads 22.02k
Release Time: 9/21/2024

Model Overview

TimeMoE is a large-scale time series forecasting model built on the Mixture of Experts architecture, capable of handling complex patterns in time series data.

Model Features

Mixture of Experts architecture
A sparse MoE design activates only a subset of experts per input, so the model processes large-scale time series data efficiently despite its large parameter count
Large-scale parameters
The TimeMoE family scales to billion-level parameter counts to capture complex time series patterns; this checkpoint is the 50M-parameter variant
Foundation model
As a foundation model in the time series domain, it can adapt to various forecasting tasks
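To make the Mixture of Experts idea concrete, the sketch below shows a minimal sparse top-k MoE layer in numpy. This is a generic illustration of the technique, not TimeMoE's actual implementation: a gate scores the experts for each token, only the top-k experts are evaluated, and their outputs are mixed with softmax weights.

```python
import numpy as np

def moe_forward(x, gate_w, experts, k=2):
    """Minimal sparse MoE layer (illustrative, not TimeMoE's exact code).

    x:       [tokens, dim] input activations
    gate_w:  [dim, n_experts] gating weights
    experts: list of callables, each mapping [dim] -> [dim]
    """
    logits = x @ gate_w                          # [tokens, n_experts]
    topk = np.argsort(logits, axis=-1)[:, -k:]   # indices of the k best experts
    out = np.zeros_like(x)
    for t in range(x.shape[0]):
        sel = logits[t, topk[t]]
        w = np.exp(sel - sel.max())
        w /= w.sum()                             # softmax over the selected experts
        for weight, e in zip(w, topk[t]):
            out[t] += weight * experts[e](x[t])  # only k experts run per token
    return out

rng = np.random.default_rng(0)
dim, n_experts = 8, 4
# Hypothetical experts: simple linear maps standing in for expert FFNs.
experts = [(lambda W: (lambda v: v @ W))(rng.normal(size=(dim, dim)) * 0.1)
           for _ in range(n_experts)]
gate_w = rng.normal(size=(dim, n_experts))
x = rng.normal(size=(5, dim))
y = moe_forward(x, gate_w, experts)              # shape (5, 8)
```

Because only k of the n experts execute per token, compute cost grows with k rather than with the total parameter count, which is what lets MoE models scale parameters cheaply.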

Model Capabilities

Time series forecasting
Long-term dependency modeling
Multivariate time series processing
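The sketch below illustrates the kind of pre- and post-processing a forecasting foundation model like this typically needs: per-series zero-mean, unit-variance normalization of the context window before inference, and denormalization of the forecast afterwards. The helper names are our own and the normalization scheme is an assumption; consult the model's official usage example for the exact preprocessing.

```python
import numpy as np

def normalize_context(series):
    """Zero-mean, unit-variance normalization per series (assumed scheme)."""
    mean = series.mean(axis=-1, keepdims=True)
    std = series.std(axis=-1, keepdims=True) + 1e-8  # avoid division by zero
    return (series - mean) / std, mean, std

def denormalize_forecast(forecast, mean, std):
    """Map model outputs back to the original scale."""
    return forecast * std + mean

# Example: a batch of two context windows of length 12.
context = np.array([
    [10.0, 11, 12, 13, 14, 15, 16, 17, 18, 19, 20, 21],
    [5.0, 5, 6, 6, 7, 7, 8, 8, 9, 9, 10, 10],
])
normed, mean, std = normalize_context(context)

# With the actual checkpoint, the normalized batch would be fed to the
# model (loaded via transformers with trust_remote_code) to generate
# the next points, and the outputs passed through denormalize_forecast.
restored = denormalize_forecast(normed, mean, std)
```

`restored` reproduces the original context, confirming the two helpers are exact inverses, so forecasts denormalized this way land on the original scale of each series.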

Use Cases

Finance
Stock price prediction
Predict future stock price trends
Energy
Electricity demand forecasting
Predict future electricity consumption patterns
Retail
Sales forecasting
Predict future product sales trends