Moirai
Summary
Moirai is Salesforce’s family of universal time series forecasting models, released through the Uni2TS library. This page covers the original masked-encoder universal forecaster, the Moirai-MoE sparse-expert branch, and the simplified decoder-only Moirai 2.0 line.
Lineage
- Moirai introduces the original universal forecasting Transformer, combining LOTSA pretraining, Any-variate Attention (sketched after this list), multi-patch-size input/output projections, and a mixture distribution output head.
- Moirai-MoE replaces the frequency-specific multi-patch-size projections with a single projection plus sparse token-level expert routing, so frequency specialization is learned by the router rather than hand-assigned by patch size (see the routing sketch below).
- Moirai 2.0 simplifies the family into a decoder-only forecaster with a quantile output head (see the loss sketch below) and reports that the small released checkpoint outperforms the larger earlier variants on the paper’s aggregate benchmark.
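The Any-variate Attention bias reduces to a small sketch: multivariate series are flattened into one token sequence, and attention scores get a learned scalar bias depending only on whether two tokens share a variate. This is illustrative, not the uni2ts implementation; the real model applies the bias per head alongside rotary time embeddings, and the function name and shapes here are assumptions.

```python
import torch
import torch.nn.functional as F

def any_variate_attention(q, k, v, variate_id, u_same, u_diff):
    """Single-head attention with a binary same-variate bias.

    q, k, v:       (seq, d) projections over flattened (variate, time) tokens
    variate_id:    (seq,) integer id of the variate each token came from
    u_same/u_diff: learned scalar biases for same- vs cross-variate pairs
    """
    d = q.size(-1)
    scores = q @ k.transpose(-1, -2) / d**0.5
    same = variate_id.unsqueeze(-1) == variate_id.unsqueeze(-2)
    # The bias depends only on whether two tokens share a variate, so the
    # layer is equivariant to permutations of the variate ordering.
    scores = scores + torch.where(same, u_same, u_diff)
    return F.softmax(scores, dim=-1) @ v

# Two variates, three patch tokens each, flattened into one sequence.
q, k, v = (torch.randn(6, 16) for _ in range(3))
vid = torch.tensor([0, 0, 0, 1, 1, 1])
out = any_variate_attention(q, k, v, vid, torch.tensor(0.5), torch.tensor(-0.5))
```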
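Token-level routing in Moirai-MoE can likewise be sketched as a generic top-k router over expert feed-forward networks. This is not the uni2ts implementation; the class name, expert count, and k are assumptions, and real implementations dispatch tokens with fused kernels rather than the dense loop below.

```python
import torch
import torch.nn as nn

class TopKMoEFFN(nn.Module):
    """Minimal token-level sparse MoE feed-forward layer (illustrative)."""

    def __init__(self, d_model: int, d_ff: int, n_experts: int = 8, k: int = 2):
        super().__init__()
        self.k = k
        self.gate = nn.Linear(d_model, n_experts, bias=False)
        self.experts = nn.ModuleList(
            nn.Sequential(nn.Linear(d_model, d_ff), nn.GELU(), nn.Linear(d_ff, d_model))
            for _ in range(n_experts)
        )

    def forward(self, x):  # x: (tokens, d_model)
        # Each token picks its top-k experts; gate weights are renormalized.
        weights, idx = self.gate(x).softmax(dim=-1).topk(self.k, dim=-1)
        weights = weights / weights.sum(dim=-1, keepdim=True)
        out = torch.zeros_like(x)
        for j in range(self.k):  # dense loop for clarity, not efficiency
            for e, expert in enumerate(self.experts):
                mask = idx[:, j] == e
                if mask.any():
                    out[mask] += weights[mask, j].unsqueeze(-1) * expert(x[mask])
        return out

layer = TopKMoEFFN(d_model=64, d_ff=128)
y = layer(torch.randn(16, 64))
```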
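Moirai 2.0’s quantile interface amounts to training a multi-quantile output head with the pinball loss. A minimal version follows; the decile levels are an assumption for illustration, not the paper’s exact configuration.

```python
import torch

def pinball_loss(pred, target, quantiles):
    """Pinball (quantile) loss averaged over tokens and quantile levels.

    pred:      (..., n_q) predicted values, one per quantile level
    target:    (...,)     observed values
    quantiles: (n_q,)     levels in (0, 1)
    """
    err = target.unsqueeze(-1) - pred  # > 0 when the model under-predicts
    return torch.maximum(quantiles * err, (quantiles - 1.0) * err).mean()

levels = torch.arange(1, 10) / 10  # deciles 0.1 ... 0.9 (assumed)
pred = torch.randn(32, levels.numel())
target = torch.randn(32)
loss = pinball_loss(pred, target, levels)
```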
Official Artifacts
- Uni2TS source: https://github.com/SalesforceAIResearch/uni2ts
- Moirai 1.0 and 1.1 checkpoints: https://huggingface.co/Salesforce (moirai-1.0-R and moirai-1.1-R, each in small, base, and large sizes)
- Moirai-MoE checkpoints: https://huggingface.co/Salesforce/moirai-moe-1.0-R-small and https://huggingface.co/Salesforce/moirai-moe-1.0-R-base
- Moirai 2.0 small checkpoint: https://huggingface.co/Salesforce/moirai-2.0-R-small
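The checkpoints above load through the uni2ts library. The sketch below follows the quickstart pattern in the uni2ts README for the 1.x masked-encoder models; the toy series is made up, and argument names may differ across library versions and for the MoE and 2.0 model classes.

```python
import numpy as np
import pandas as pd
from gluonts.dataset.pandas import PandasDataset
from uni2ts.model.moirai import MoiraiForecast, MoiraiModule

# Hypothetical toy series; any univariate DataFrame with a DatetimeIndex works.
df = pd.DataFrame(
    {"target": np.sin(np.arange(400) / 10)},
    index=pd.date_range("2020-01-01", periods=400, freq="h"),
)
ds = PandasDataset(df)

model = MoiraiForecast(
    module=MoiraiModule.from_pretrained("Salesforce/moirai-1.1-R-small"),
    prediction_length=24,
    context_length=200,
    patch_size="auto",  # let the model select among its patch-size projections
    num_samples=100,
    target_dim=1,
    feat_dynamic_real_dim=0,
    past_feat_dynamic_real_dim=0,
)
predictor = model.create_predictor(batch_size=32)
forecast = next(iter(predictor.predict(ds)))
print(forecast.mean)  # point summary of the sampled predictive distribution
```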
Role In The Wiki
Moirai is a compact example of how one model family can test several architecture hypotheses: universal masked-encoder forecasting, sparse expert routing, and a smaller decoder-only quantile interface. It also serves as a benchmark-hygiene reminder: early versions support multivariate inputs and covariates while Moirai 2.0 exposes a simpler univariate interface, so cross-version comparisons must hold the input setup fixed.