
TimesFM

TimesFM is a pretrained time-series foundation model from Google Research designed for long-context forecasting and uncertainty estimation.

Detailed Introduction

TimesFM (Time Series Foundation Model) from Google Research is a pretrained foundation model tailored for time-series forecasting. It uses a decoder-only architecture optimized for long-context inputs and uncertainty modeling, and supports continuous quantile forecasts and scalable context lengths. The open-source repository provides implementations, training and inference scripts, example configs, and pretrained checkpoints for research and engineering use.
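As a rough sketch of direct inference with a pretrained checkpoint, the snippet below uses the `timesfm` Python package. The class and parameter names (`TimesFm`, `TimesFmHparams`, `TimesFmCheckpoint`, `forecast`) and the Hugging Face repo id follow the project's documented 2.0 API as recalled here; they may differ between releases, so treat them as assumptions to verify against the repository's README.

```python
import numpy as np
import timesfm  # assumed: the Python package published from the TimesFM repository

# Load a pretrained checkpoint from Hugging Face. The repo id and the hparams
# matching it are taken from the README and may change between versions.
tfm = timesfm.TimesFm(
    hparams=timesfm.TimesFmHparams(
        backend="cpu",                  # or "gpu"/"tpu" depending on hardware
        per_core_batch_size=32,
        horizon_len=128,                # forecast horizon in time steps
        num_layers=50,                  # settings documented for the 2.0-500m checkpoint
        use_positional_embedding=False,
        context_len=2048,
    ),
    checkpoint=timesfm.TimesFmCheckpoint(
        huggingface_repo_id="google/timesfm-2.0-500m-pytorch"
    ),
)

# Each input is a 1D history array; freq is a coarse frequency indicator
# (0 = high frequency, e.g. daily or finer, per the repo's convention).
histories = [np.sin(np.arange(512) / 10.0), np.arange(256) * 0.1]
point_forecast, quantile_forecast = tfm.forecast(histories, freq=[0, 0])

print(point_forecast.shape)     # (num_series, horizon_len)
print(quantile_forecast.shape)  # (num_series, horizon_len, 1 + num_quantiles)
```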

Main Features

  • Pretrained foundation checkpoints at multiple model scales, for fine-tuning or direct inference.
  • Long-context support to improve long-horizon forecasting.
  • Uncertainty estimation via continuous quantile heads that produce prediction intervals (see the interval sketch after this list).
  • Engineering-ready implementations with PyTorch/Flax support and performance-tuning options.
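Continuing the inference sketch above, the quantile output can be turned into a prediction interval. The assumption that the last dimension holds the mean at index 0 followed by the 0.1 through 0.9 quantiles reflects the default quantile head described in the repository and should be checked for your checkpoint and version.

```python
# quantile_forecast is assumed to have shape (num_series, horizon_len, 1 + num_quantiles),
# with the mean at index 0 followed by quantiles 0.1 ... 0.9 (verify per version).
lower = quantile_forecast[..., 1]    # 0.1 quantile -> lower bound of an 80% interval
upper = quantile_forecast[..., -1]   # 0.9 quantile -> upper bound
median = quantile_forecast[..., 5]   # 0.5 quantile, if that indexing holds

for series_idx in range(point_forecast.shape[0]):
    print(f"series {series_idx}: first-step forecast "
          f"{point_forecast[series_idx, 0]:.3f} "
          f"[{lower[series_idx, 0]:.3f}, {upper[series_idx, 0]:.3f}]")
```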

Use Cases

  • Demand forecasting and inventory planning in retail and supply chains.
  • Energy and traffic forecasting for grids, networks, and infrastructure metrics.
  • Financial time-series forecasting with interval estimates for risk-aware decisions.
  • Research baseline and pretrained model for time-series experiments.

Technical Features

  • Architecture: decoder-only transformer with time-series-specific input encodings and normalization.
  • Extensibility: supports PyTorch/Flax backends, a pluggable quantile head, and covariate (exogenous) inputs.
  • Performance: asynchronous/parallel inference and optimizations for different hardware backends.
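For batched, parallelized inference over many series, the repository's README documents a DataFrame helper; the sketch below reuses the `tfm` object from the earlier snippet, and the helper name `forecast_on_df` and its parameters (`freq`, `value_name`, `num_jobs`) are recalled from that README rather than guaranteed, so verify them for your installed version.

```python
import pandas as pd

# Long-format input: one row per (series id, timestamp), per the README's convention.
input_df = pd.DataFrame({
    "unique_id": ["series_a"] * 200,
    "ds": pd.date_range("2020-01-01", periods=200, freq="D"),
    "y": range(200),
})

forecast_df = tfm.forecast_on_df(
    inputs=input_df,
    freq="D",        # pandas-style frequency of the input series
    value_name="y",  # column holding the target values
    num_jobs=-1,     # parallelize preprocessing across available cores
)
print(forecast_df.head())
```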

Resource Info: 🌱 Open Source · 🏗️ Model · 💾 Data