Moirai

Summary

Moirai is Salesforce’s family of time-series forecasting models, released through the Uni2TS library. This entry covers the original masked-encoder universal forecaster, the Moirai-MoE sparse-expert branch, and the simplified decoder-only Moirai 2.0 line.

Lineage

  • Moirai introduces the original universal forecasting Transformer with LOTSA pretraining, Any-variate Attention, multi-patch-size projections, and mixture distribution heads.
  • Moirai-MoE adds sparse token-level expert routing to replace frequency-specific projections with learned specialization.
  • Moirai 2.0 simplifies the family into a decoder-only quantile forecaster and reports that the small released model outperforms larger variants under the paper’s aggregate benchmark score.
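Two of the mechanisms above can be illustrated together: patching splits each series into fixed-size chunks, and Any-variate Attention flattens all variates into one token sequence tagged with variate IDs so a single attention stack can mix information across them. The sketch below is a minimal illustration of that tokenization idea; `flatten_any_variate` is a hypothetical helper, not part of the Uni2TS API, and the real models add learned projections and rotary/positional encodings on top.

```python
import numpy as np

def flatten_any_variate(series: np.ndarray, patch_size: int):
    """Illustrative sketch: turn a multivariate series of shape
    (variates V, timesteps T) into one flat token sequence of patches,
    tagging each token with its variate index and patch index so a
    single attention stack can attend across all variates."""
    V, T = series.shape
    assert T % patch_size == 0, "sketch assumes T divides evenly into patches"
    n_patches = T // patch_size
    patches = series.reshape(V, n_patches, patch_size)   # (V, P, patch_size)
    tokens = patches.reshape(V * n_patches, patch_size)  # one flat sequence
    variate_ids = np.repeat(np.arange(V), n_patches)     # which variate each token came from
    time_ids = np.tile(np.arange(n_patches), V)          # which patch position in time
    return tokens, variate_ids, time_ids

# Two variates, 12 timesteps, patch size 4 -> 6 tokens of width 4.
tokens, vids, tids = flatten_any_variate(np.arange(24.0).reshape(2, 12), patch_size=4)
print(tokens.shape)    # (6, 4)
print(vids.tolist())   # [0, 0, 0, 1, 1, 1]
```

The variate-ID tags stand in for the attention bias the real model uses to tell variates apart while remaining permutation-equivariant over them.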

Official Artifacts

Role In The Wiki

Moirai is a compact example of how one model family can test several architecture hypotheses: universal masked-encoder forecasting, sparse expert routing, and a smaller decoder-only quantile interface. It is also a reminder about benchmark hygiene, because the family mixes multivariate/covariate support in the early versions with a simpler univariate interface in Moirai 2.0, so cross-version comparisons must hold the evaluation setup fixed.
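The decoder-only quantile interface mentioned above is typically trained with the quantile (pinball) loss, which penalizes under-prediction by q and over-prediction by 1 − q at each quantile level. The sketch below shows that objective in a hedged, generic form; it is not Moirai 2.0’s exact implementation, and the quantile levels are illustrative.

```python
import numpy as np

def pinball_loss(y_true: np.ndarray, y_pred: np.ndarray, q: float) -> float:
    """Quantile (pinball) loss at level q: under-prediction (diff > 0)
    costs q per unit, over-prediction costs (1 - q) per unit.
    Generic sketch of the standard objective, not a specific model's code."""
    diff = y_true - y_pred
    return float(np.mean(np.maximum(q * diff, (q - 1) * diff)))

# A quantile forecaster emits one prediction per level and is trained
# on the sum (or mean) of the per-level losses.
y = np.array([10.0, 12.0, 9.0])
preds = {0.1: np.array([8.0, 10.0, 7.0]),    # low quantile sits below the target
         0.5: np.array([10.0, 12.0, 9.0]),   # median matches exactly here
         0.9: np.array([12.0, 14.0, 11.0])}  # high quantile sits above
total = sum(pinball_loss(y, p, q) for q, p in preds.items())
print(round(total, 3))  # → 0.4
```

Minimizing this loss at level q drives the prediction toward the q-th conditional quantile, which is why a small set of levels yields calibrated prediction intervals without a parametric distribution head.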