FlowState: Sampling Rate Invariant Time Series Forecasting

Source

The converted arXiv FlowState paper is the source of record for this ingest; the official IBM model cards for the released weights serve as supplementary, live artifact metadata.

Core Claim

FlowState argues that a compact SSM-based time-series foundation model can generalize across context lengths, forecast horizons, and sampling rates by encoding the observed history into a coefficient space and decoding forecasts through a continuous functional basis.

Benchmarked Model Entry

  • Model: FlowState
  • Family: SSM encoder plus functional basis decoder.
  • Parameters: main paper variants include 2.6M and 9.1M parameter models; later official model cards also track FlowState release revisions.
  • Primary task surface: zero-shot time-series forecasting.
  • Evaluation surface: GIFT-ZS, Chronos-ZS, full GIFT-Eval-style comparisons, and controlled Loop Seattle sampling-rate experiments.

Key Contributions

  • Uses an S5/SSM encoder to process time series directly without patching or quantization.
  • Introduces a functional basis decoder that interprets encoder outputs as coefficients of a continuous forecast function.
  • Samples the continuous forecast at a requested interval, allowing the forecast horizon and sampling rate to change at inference time (see the sketch after this list).
  • Applies a time-scale factor to the SSM discretization parameter and to the decoder, so the model can adapt to sampling rates unseen during training.
  • Trains with parallel forecasts from many context lengths, increasing training signal and robustness.
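
The contribution list above maps onto a short forward pass: encode the history into a fixed-size coefficient vector, then sample a continuous function built from those coefficients at whatever resolution the request asks for. The sketch below is a hedged illustration of that reading, not the released implementation: the S5 encoder is replaced by a least-squares stand-in, and the names `encode_to_coefficients`, `decode_forecast`, and `scale` are illustrative assumptions.

```python
import numpy as np

def encode_to_coefficients(history: np.ndarray, num_coeffs: int) -> np.ndarray:
    """Stand-in for the S5/SSM encoder: maps a history of any length to a
    fixed-size coefficient vector (here via a least-squares Legendre fit,
    purely to keep the sketch self-contained)."""
    t = np.linspace(-1.0, 1.0, len(history))
    design = np.polynomial.legendre.legvander(t, num_coeffs - 1)  # (T, num_coeffs)
    coeffs, *_ = np.linalg.lstsq(design, history, rcond=None)
    return coeffs

def decode_forecast(coeffs: np.ndarray, horizon_steps: int, scale: float = 1.0) -> np.ndarray:
    """Functional basis decoder: treat the coefficients as a continuous
    function over the forecast window and sample it at the requested
    resolution. `scale` mimics the time-scale adjustment: a coarser target
    sampling rate stretches the query grid over the same coefficients."""
    query = np.clip(np.linspace(0.0, 1.0, horizon_steps) * scale, 0.0, 1.0)
    design = np.polynomial.legendre.legvander(2.0 * query - 1.0, len(coeffs) - 1)
    return design @ coeffs

# Horizon length and output sampling rate are chosen at inference time.
history = np.sin(np.linspace(0.0, 8.0 * np.pi, 256))
coeffs = encode_to_coefficients(history, num_coeffs=32)
dense = decode_forecast(coeffs, horizon_steps=96)   # fine-grained forecast
sparse = decode_forecast(coeffs, horizon_steps=8)   # same curve, fewer samples
```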

Method Notes

FlowState is not a number-tokenization method in the language-model sense. Its relevance to numeric feature encoding is architectural: it moves from raw numeric observations into a coefficient space, then maps coefficients back to numeric forecasts with continuous basis functions.

The default decoder uses Legendre polynomials, and the paper's ablations report that Fourier-basis and half-Legendre variants perform similarly. That makes FlowState a useful companion source for the question of when periodic versus polynomial bases should be used for time-series values.
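
To make the basis question concrete, the sketch below builds Legendre and Fourier design matrices over the same coefficient budget; both turn one coefficient vector into a forecast curve. This is an assumption-level illustration of the ablation's framing, not the paper's parameterization, and the helper names are hypothetical.

```python
import numpy as np

def legendre_basis(query: np.ndarray, num_coeffs: int) -> np.ndarray:
    """Legendre design matrix on [0, 1]: column k is the degree-k polynomial."""
    return np.polynomial.legendre.legvander(2.0 * query - 1.0, num_coeffs - 1)

def fourier_basis(query: np.ndarray, num_coeffs: int) -> np.ndarray:
    """Fourier design matrix on [0, 1]: a constant column plus sine/cosine pairs."""
    cols = [np.ones_like(query)]
    harmonic = 1
    while len(cols) < num_coeffs:
        cols.append(np.sin(2.0 * np.pi * harmonic * query))
        if len(cols) < num_coeffs:
            cols.append(np.cos(2.0 * np.pi * harmonic * query))
        harmonic += 1
    return np.stack(cols, axis=-1)

# The same coefficients read as either a polynomial or a periodic forecast;
# the ablation question is which family suits trend vs. seasonal structure.
query = np.linspace(0.0, 1.0, 48)
coeffs = np.random.default_rng(0).normal(size=16)
forecast_poly = legendre_basis(query, 16) @ coeffs
forecast_periodic = fourier_basis(query, 16) @ coeffs
```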

The official IBM model cards describe the FlowState research and Granite release weights as tied to the same arXiv paper. Treat those model-card claims as live artifact metadata, and the converted arXiv paper as the source of record for this ingest.

For this wiki, FlowState remains a passive forecasting model. It models observed dynamics and sampling-rate adaptation, but it does not expose action, control input, intervention, or counterfactual rollout semantics.

Evidence And Results

The paper reports strong zero-shot forecasting results from small models, including 2.6M and 9.1M variants, and emphasizes that FlowState remains competitive against much larger baselines. The Loop Seattle subsampling experiment is the clearest evidence for the sampling-rate claim: FlowState is evaluated across 5-minute to 65-minute sampling intervals and is reported to generalize especially well at uncommon intervals.
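
The subsampling protocol can be approximated with a small harness: keep every k-th observation of the 5-minute Loop Seattle series and hold the physical forecast horizon fixed. The sketch below assumes a hypothetical `forecast(context, steps, scale)` callable; the released evaluation code and model API are not reproduced here.

```python
import numpy as np

def subsample(series: np.ndarray, stride: int) -> np.ndarray:
    """Keep every `stride`-th observation; stride=13 turns a 5-minute series
    into a 65-minute one."""
    return series[::stride]

def evaluate_sampling_rates(series_5min: np.ndarray, forecast, horizon_minutes: int = 240) -> dict:
    """Run the same forecaster over coarser resamplings of one signal while the
    wall-clock horizon stays fixed. `forecast(context, steps, scale)` is a
    hypothetical interface standing in for the model call."""
    results = {}
    for stride in range(1, 14):                      # 5-minute through 65-minute intervals
        interval = 5 * stride
        context = subsample(series_5min, stride)
        steps = horizon_minutes // interval          # same horizon in minutes, fewer steps
        results[interval] = forecast(context, steps, scale=float(stride))
    return results
```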

The ablation section supports the core architecture: removing time-scale adjustment degrades performance, and decoder variants based on Fourier or half-Legendre bases remain close to the default Legendre design.

Limitations

The source is centered on forecasting metrics and passive dynamics. It does not directly answer whether continuous basis decoding helps encode known future exogenous variables, control inputs, interventions, or auxiliary numeric metadata. Its evidence also depends on benchmark protocol details that should be routed through Time-Series Benchmark Hygiene before treating leaderboard rank as settled.

Open Questions

  • Can the functional basis decoder be reused as an output head for point-wise time-series embeddings with known future exogenous variables?
  • Which basis should be preferred for mixed trend, seasonality, and regime-shift signals: Legendre, Fourier, learned bases, or hybrids?
  • How should coefficient-space forecasts expose uncertainty and interventions if the model is extended toward action-conditioned world modeling?