TimesFM — Time Series Foundation Model (Google Research)
Overview
TimesFM is a pretrained, decoder-only foundation model from Google Research built specifically for time-series forecasting. The GitHub repository hosts the open-source code, installation and usage instructions, model variants, and pointers to released checkpoints (for example, via Hugging Face). The project aims to bring foundation-model-style pretraining and inference workflows to forecasting, so that researchers and engineers can fine-tune pretrained models or use them directly for a range of time-series tasks.
Key features
- Decoder-only transformer architecture tailored for time-series forecasting.
- Pretrained checkpoints released and collected (Hugging Face collection referenced in the repo).
- Support for multiple backends: PyTorch for the primary examples, with a Flax implementation planned (and in progress) for faster inference.
- Configurable inference/forecasting options (context length, horizon, normalization, quantile heads, flags for invariances and positivity, etc.).
- Explicitly documented model versions with migration/compatibility notes (e.g., TimesFM 2.5 changes in parameter count, context length, and optional quantile head).
- Integration guidance for production use (notably TimesFM availability in Google BigQuery as a product integration).
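Among the options listed above, per-series input normalization is a standard step in forecasting pipelines: the context window is standardized before inference and the outputs are mapped back afterward. The sketch below illustrates that general idea only; the function names are hypothetical and do not reflect TimesFM's internals or its actual normalization flag.

```python
import statistics

def normalize_context(context):
    """Standardize a context window to zero mean, unit std.
    Illustrative only -- not the timesfm library's internals."""
    mu = statistics.fmean(context)
    sigma = statistics.pstdev(context) or 1.0  # guard against constant series
    return [(x - mu) / sigma for x in context], mu, sigma

def denormalize_forecast(forecast, mu, sigma):
    """Map normalized model outputs back to the original scale."""
    return [y * sigma + mu for y in forecast]

context = [10.0, 12.0, 11.0, 13.0]
normed, mu, sigma = normalize_context(context)
restored = denormalize_forecast(normed, mu, sigma)  # round-trips to the input
```

The round trip matters in practice: forecasts produced in normalized space are meaningless until rescaled with the same statistics used on the context.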
Model versions and notable changes
- The repo documents multiple model versions and archives older code under a v1 subdirectory (the earlier 1.0 and 2.0 artifacts).
- TimesFM 2.5 (released and documented in the repo) reduces parameter count compared to earlier releases, increases supported context length, and introduces an optional continuous quantile head for denser quantile forecasting. The repo lists migration notes and flags introduced in newer versions.
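Quantile forecasts like those produced by a quantile head are conventionally trained and evaluated with the pinball (quantile) loss, which penalizes under- and over-prediction asymmetrically. The sketch below shows that standard loss as an assumption about how such heads are typically scored, not as TimesFM's documented objective.

```python
def pinball_loss(y_true, y_pred, q):
    """Pinball (quantile) loss for one prediction at quantile level q.
    Standard definition; illustrative, not TimesFM's exact training objective."""
    diff = y_true - y_pred
    return q * diff if diff >= 0 else (q - 1) * diff

# At a high quantile (q=0.9), under-prediction costs more than over-prediction:
pinball_loss(10.0, 8.0, 0.9)   # -> 1.8 (predicted too low)
pinball_loss(10.0, 12.0, 0.9)  # -> 0.2 (predicted too high)
```

Minimizing this loss at many quantile levels is what lets a model emit calibrated prediction intervals rather than a single point forecast.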
Installation & quick usage
The project provides step-by-step instructions to clone the repository, create a virtual environment, and install the package in editable mode with the dependencies for the chosen backend (PyTorch or Flax). The example Python usage demonstrates loading a pretrained TimesFM 2.5 PyTorch model, compiling a forecast configuration, and running a point-plus-quantile forecast on example inputs.
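The load-configure-forecast workflow can be sketched in shape, with a trivial last-value forecaster standing in for the pretrained model. Every name below (ForecastConfig, NaiveForecaster, the field names) is illustrative, not the timesfm package's API; consult the repository README for the real calls.

```python
from dataclasses import dataclass

@dataclass
class ForecastConfig:
    # Hypothetical config mirroring the kinds of options the repo documents
    # (context length, horizon); field names are not the timesfm API.
    max_context: int = 512
    horizon: int = 8

class NaiveForecaster:
    """Stand-in model: repeats the last observed value over the horizon."""
    def __init__(self, config: ForecastConfig):
        self.config = config

    def forecast(self, series: list[float]) -> list[float]:
        context = series[-self.config.max_context:]  # truncate to max context
        return [context[-1]] * self.config.horizon

model = NaiveForecaster(ForecastConfig(max_context=4, horizon=3))
preds = model.forecast([1.0, 2.0, 3.0, 5.0])  # [5.0, 5.0, 5.0]
```

The real model replaces the naive rule with transformer inference, but the surrounding contract (a configured context length in, a fixed-horizon forecast out) is the same shape.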
Checkpoints, integrations & resources
- Official checkpoints are referenced via a Hugging Face collection (the repo points to that collection for model weights).
- The project homepage / blog post on Google Research explains the paper and model design; the repo links to the ICML 2024 paper and the research blog post.
- TimesFM is also surfaced in Google Cloud BigQuery documentation as an available product integration.
Typical use cases
- Short- and medium-horizon forecasting across many domains (retail, energy, finance, telemetry).
- Research into transfer learning and foundation-model approaches for temporal data.
- Serving pretrained forecasting models in production via inference stacks or direct BigQuery integrations.
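For any of these use cases, forecasts are commonly compared against held-out actuals with a simple horizon-level error metric such as mean absolute error. A minimal sketch (generic evaluation practice, not a TimesFM utility):

```python
def mae(actual, predicted):
    """Mean absolute error over a forecast horizon."""
    assert len(actual) == len(predicted)
    return sum(abs(a - p) for a, p in zip(actual, predicted)) / len(actual)

mae([3.0, 4.0, 5.0], [2.5, 4.5, 5.0])  # -> 0.333...
```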
Notes & repository status
- The repository notes that this open-source release is not an officially supported Google product.
- The maintainers document future work areas (e.g., Flax support, restoring covariate/XReg support for newer model variants, expanded docs and notebooks).
References
- The repo links to the ICML 2024 paper describing the architecture and training approach, the Google Research blog post summarizing the work, and the Hugging Face collection for checkpoints.
(For usage examples and exact flags/config options, consult the repository README and in-repo examples.)
