Eidos: Latent-Space Predictive Learning For Time Series Foundation Models
Source
- Raw Markdown: paper_eidos-2026.md
- PDF: paper_eidos-2026.pdf
Core Claim
Eidos shifts time-series foundation-model pretraining from direct future-value prediction to latent-space predictive learning with observation-space grounding.
Key Contributions
- Trains a causal Transformer to predict the evolution of latent representations.
- Uses a lightweight aggregation branch to construct stable target representations.
- Combines latent alignment, grounding, and forecasting supervision in one objective.
- Reports robust performance and improved latent organization on GIFT-Eval-style benchmarks.
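The three supervision signals listed above can be sketched as a combined loss. This is a minimal illustrative sketch, not the paper's architecture: the linear "networks" (`W_enc`, `W_pred`, `W_dec`), the EMA-style aggregation branch (`W_agg`), the dimensions, and the equal loss weights are all assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)
D_OBS, D_LAT, T = 4, 8, 16  # observation dim, latent dim, sequence length (all assumed)

# Hypothetical linear stand-ins for learned modules.
W_enc = rng.normal(0, 0.1, (D_OBS, D_LAT))   # online encoder
W_pred = rng.normal(0, 0.1, (D_LAT, D_LAT))  # causal one-step latent predictor
W_dec = rng.normal(0, 0.1, (D_LAT, D_OBS))   # grounding decoder back to observations
W_agg = 0.9 * W_enc                          # slow aggregation branch for stable targets (assumed EMA-style)

def eidos_style_losses(x):
    """Return (alignment, grounding, forecasting) terms for a series x of shape (T, D_OBS)."""
    z = np.tanh(x @ W_enc)       # online latents z_t
    z_tgt = np.tanh(x @ W_agg)   # stable target latents from the aggregation branch
    z_hat = z[:-1] @ W_pred      # predict z_{t+1} from z_t, causally

    align = np.mean((z_hat - z_tgt[1:]) ** 2)         # latent alignment to stable targets
    ground = np.mean((z @ W_dec - x) ** 2)            # observation-space grounding
    forecast = np.mean((z_hat @ W_dec - x[1:]) ** 2)  # forecasting supervision via decoded latents
    return align, ground, forecast

x = rng.normal(size=(T, D_OBS))
a, g, f = eidos_style_losses(x)
total = a + g + f  # combined objective; the unit weights are an assumption
```

In a real implementation the target branch would be detached from the gradient (stop-gradient), which a NumPy sketch sidesteps entirely.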
Method Notes
Eidos pretrains a causal Transformer against latent targets rather than raw future values; within the wiki it is the primary source for Latent-Space Predictive Learning in the time-series cluster and also links to Time-Series Foundation Models.
Evidence And Results
The source reports reduced structural fragmentation of the learned latent space, robustness to input noise, successful feature probing, latent steering, and competitive zero-shot forecasting.
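Feature probing of the kind reported above is typically a linear read-out trained on frozen latents. A minimal sketch with a ridge-regression probe; the latents, target attribute, dimensions, and regularization strength here are synthetic stand-ins, not the paper's data or protocol.

```python
import numpy as np

rng = np.random.default_rng(1)
N, D_LAT = 200, 8  # number of series and latent dim (assumed)

# Frozen latents from a pretrained encoder (random stand-ins here) and a
# scalar attribute to probe, e.g. a trend coefficient (invented, linear in Z).
Z = rng.normal(size=(N, D_LAT))
y = Z @ rng.normal(size=D_LAT) + 0.1 * rng.normal(size=N)

# Ridge-regression probe: fit only the linear read-out, encoder stays frozen.
lam = 1e-2
w = np.linalg.solve(Z.T @ Z + lam * np.eye(D_LAT), Z.T @ y)
r2 = 1.0 - np.sum((Z @ w - y) ** 2) / np.sum((y - y.mean()) ** 2)
```

A high `r2` indicates the attribute is linearly decodable from the latents; a well-organized latent space should score high on such probes.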
Limitations
The current evidence is forecasting-centered; direct comparisons with the reasoning-focused TimeOmni-1 and the generation-focused TimeOmni-VL remain open.
Links Into The Wiki
- Latent-Space Predictive Learning
- Time-Series Foundation Models
Open Questions
- Can Eidos-style latent predictive learning support causal or language-based reasoning?
- How should observation grounding be balanced against latent abstraction?