Time Series Foundation Models – Current State and Future Directions
Nathaniel Shimoni © 2024
Understanding Time Series Foundation Models
• What is a foundation time series model?
• All previously mentioned characteristics:
  • Granularity
  • Channels
  • Domain
  • Cross-dependencies (time-related, channel-related, both, externalities)
  • Pretraining task: forecasting (short / long term), imputation, reconstruction & autoencoding
  • Data availability: large institutes
  • Extendibility
Background - PatchTST
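PatchTST is the key background piece here: each (channel-independent) series is split into patch tokens, and a standard Transformer encoder attends over those patches rather than over individual time steps. Below is a minimal sketch of the patching step only, with illustrative patch length and stride (not necessarily the paper's settings):

```python
import numpy as np

def patchify(series: np.ndarray, patch_len: int = 16, stride: int = 8) -> np.ndarray:
    """Split a univariate series into overlapping patches (PatchTST-style tokens).

    series: shape (T,). Returns shape (num_patches, patch_len).
    Patch length and stride are illustrative defaults, not canonical values.
    """
    T = len(series)
    starts = range(0, T - patch_len + 1, stride)
    return np.stack([series[s:s + patch_len] for s in starts])

# Example: a 96-step series becomes a short sequence of patch "tokens".
x = np.sin(np.linspace(0, 8 * np.pi, 96))
patches = patchify(x)
print(patches.shape)  # (11, 16)
```

Each row of `patches` would then be linearly embedded and fed to the Transformer encoder as one token.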
Taxonomy of TSFMs
https://arxiv.org/pdf/2403.14735.pdf
Chronos – a Univariate General Forecaster
1. A paper from an AWS team.
2. Uses quantiles as the tokenizer: real values are discretized into a fixed vocabulary of bins.
3. Based on Google's T5 model architecture, trained in much the same way as a language model.
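The tokenization point is the core mechanism: observations are scaled and binned so that a T5-style language model can be trained on them with an ordinary cross-entropy objective. A minimal sketch, assuming mean-abs scaling and quantile bin edges (the actual Chronos pipeline differs in details such as bin placement and special tokens):

```python
import numpy as np

def tokenize_series(values: np.ndarray, n_bins: int = 512):
    """Turn real values into discrete token ids via scaling + binning (sketch)."""
    scale = np.mean(np.abs(values)) + 1e-8                         # simple mean-abs scaling
    scaled = values / scale
    edges = np.quantile(scaled, np.linspace(0, 1, n_bins + 1)[1:-1])  # quantile bin edges
    tokens = np.digitize(scaled, edges)                            # token id = bin index
    return tokens, scale, edges

def detokenize(tokens: np.ndarray, scale: float, edges: np.ndarray) -> np.ndarray:
    """Map token ids back to approximate values (bin centers), then undo the scaling."""
    centers = np.concatenate([[edges[0]], (edges[:-1] + edges[1:]) / 2, [edges[-1]]])
    return centers[tokens] * scale

x = np.cumsum(np.random.randn(200))          # toy random-walk series
toks, s, e = tokenize_series(x)
x_hat = detokenize(toks, s, e)               # lossy reconstruction from the token sequence
```

Forecasting then amounts to sampling future tokens from the language model and de-tokenizing them back into values.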
TimesFM (Google)
1. Created a decoder-only transformer model purpose-built for time-series forecasting, better suited to the task than repurposed large language models such as GPT-3 and Llama-2.
2. Trained on a large corpus containing Google Trends, Wikipedia page views, and synthetic data.
3. Surpassed statistics-based methods on complex data.
4. Analyzed forecasting models and demonstrated generalization through evaluations on diverse datasets.
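A decoder-only forecaster generates the future autoregressively: each predicted output patch is appended to the context and decoding continues until the horizon is covered. A minimal sketch of that loop with a stand-in predictor (the real model is a causal Transformer, and its patch lengths differ):

```python
import numpy as np

def stub_decoder(context: np.ndarray, out_len: int) -> np.ndarray:
    """Stand-in for a decoder-only forecaster: naive last-value repetition.
    In a real model this would be a causal Transformer emitting an output patch."""
    return np.repeat(context[-1], out_len)

def autoregressive_forecast(context: np.ndarray, horizon: int, out_patch: int = 32) -> np.ndarray:
    """Roll the model forward patch by patch, feeding predictions back as context."""
    ctx = context.copy()
    preds = []
    while sum(len(p) for p in preds) < horizon:
        nxt = stub_decoder(ctx, out_patch)
        preds.append(nxt)
        ctx = np.concatenate([ctx, nxt])     # decoder-only style: outputs become new inputs
    return np.concatenate(preds)[:horizon]

history = np.sin(np.arange(256) / 10.0)
forecast = autoregressive_forecast(history, horizon=128)
```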
MOIRAI - Salesforce
1. Introduced MOIRAI, a universal time series forecasting Transformer.
2. Addressed the challenges of cross-domain, cross-frequency forecasting with a novel Transformer architecture.
3. Leveraged the LOTSA dataset for pre-training, with 27B observations across 13 domains.
4. Demonstrated competitive or superior performance compared to state-of-the-art baselines.
MOMENT – a Family of TSFMs
1. Developed MOMENT, a transformer model pre-trained on masked time-series prediction.
2. Evaluated MOMENT's effectiveness across various tasks, demonstrating superior performance in anomaly detection and imputation.
3. Investigated MOMENT's ability to capture time-series characteristics without explicit training, providing insights into its interpretability and generalization.
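The pretraining objective in point 1 is masked time-series modeling: random patches of the input are hidden and the model learns to reconstruct them. A minimal sketch of that objective, assuming zero-filled masking and an MSE loss on masked patches only (the actual model's mask embedding and loss details differ):

```python
import numpy as np

rng = np.random.default_rng(0)

def mask_patches(series: np.ndarray, patch_len: int = 8, mask_ratio: float = 0.3):
    """Split a series into patches and randomly mask a fraction of them.

    Returns the corrupted input, the boolean patch mask, and the original patches.
    Patch length and mask ratio are illustrative, not the model's exact settings.
    """
    n_patches = len(series) // patch_len
    patches = series[: n_patches * patch_len].reshape(n_patches, patch_len)
    mask = rng.random(n_patches) < mask_ratio
    corrupted = patches.copy()
    corrupted[mask] = 0.0                     # stand-in for a learned [MASK] embedding
    return corrupted, mask, patches

def reconstruction_loss(pred_patches, target_patches, mask) -> float:
    """Masked-prediction objective: MSE computed only on the masked patches."""
    return float(np.mean((pred_patches[mask] - target_patches[mask]) ** 2))

x = np.cumsum(rng.standard_normal(128))
corrupted, mask, target = mask_patches(x)
# A real model would reconstruct the masked patches; here we score the corrupted input itself.
print(reconstruction_loss(corrupted, target, mask))
```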
Other significant improvements - iTransformer
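iTransformer's improvement is to "invert" the usual tokenization: instead of one token per time step, each variate's entire series is embedded as a single token, so self-attention models dependencies across channels rather than across time. A minimal sketch of that inverted embedding, with a random projection standing in for the learned linear layer:

```python
import numpy as np

rng = np.random.default_rng(0)

def inverted_embedding(x: np.ndarray, d_model: int = 64) -> np.ndarray:
    """Embed each variate's entire series as one token (iTransformer-style inversion).

    x: shape (n_variates, seq_len). Returns (n_variates, d_model), so attention
    afterwards mixes information across variates, not across time steps.
    """
    n_variates, seq_len = x.shape
    W = rng.standard_normal((seq_len, d_model)) / np.sqrt(seq_len)  # stand-in for a learned layer
    return x @ W

series = rng.standard_normal((7, 96))        # 7 channels, 96 time steps
tokens = inverted_embedding(series)
print(tokens.shape)                          # (7, 64): one token per variate
```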
Future Directions
• Multi-target training
• Multi-modal models
• Segregation: task-specific foundation models vs. data-specific foundation models