
Moirai 2.0: Salesforce’s Faster, Smaller Time-Series Foundation Model Tops GIFT-Eval

Salesforce AI Research released Moirai 2.0, a decoder-only transformer time-series foundation model that tops GIFT-Eval while offering 44% faster inference and a 96% reduction in size. The model brings practical improvements for enterprise forecasting across IT operations, sales, demand, and supply chain use cases.

What Moirai 2.0 Brings

Salesforce AI Research has released Moirai 2.0, a next-generation time-series foundation model built on a decoder-only transformer. The update focuses on real-world efficiency and scalability: it ranks first on the GIFT-Eval benchmark for time-series forecasting while delivering dramatically smaller model size and faster inference than its predecessor.

Architecture and Efficiency Gains

Decoder-only Transformer

Moirai 2.0 replaces a masked encoder approach with a decoder-only transformer. This architecture better supports autoregressive forecast generation and scales more efficiently across large and complex datasets.
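The mechanism that makes a decoder-only model naturally autoregressive can be shown with a toy sketch (illustrative code, not Moirai's implementation): a causal attention mask lets each position attend only to itself and earlier positions, so forecasts can be generated left to right.

```python
def causal_mask(n):
    """Lower-triangular attention mask: row i marks the positions step i
    may attend to (only j <= i), which is what allows a decoder-only
    model to generate forecasts autoregressively."""
    return [[1 if j <= i else 0 for j in range(n)] for i in range(n)]

print(causal_mask(3))  # [[1, 0, 0], [1, 1, 0], [1, 1, 1]]
```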

Efficient Multi-Token Prediction

Instead of generating one token at a time, Moirai 2.0 predicts multiple tokens in each step, improving inference throughput and stability during forecasting.
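The throughput effect is easy to see in toy code (the `toy_predict_step` model below is a hypothetical stand-in, not Moirai's predictor): emitting several values per decode step shrinks the number of autoregressive iterations needed to cover the horizon.

```python
def toy_predict_step(history, tokens_per_step):
    """Hypothetical decode step: naively repeats the last value."""
    return [history[-1]] * tokens_per_step

def autoregress(history, horizon, tokens_per_step):
    """Generate `horizon` values; more tokens per step means fewer steps."""
    out, steps = list(history), 0
    while len(out) - len(history) < horizon:
        out.extend(toy_predict_step(out, tokens_per_step))
        steps += 1
    return out[len(history):len(history) + horizon], steps

_, steps_single = autoregress([1.0, 2.0, 3.0], horizon=8, tokens_per_step=1)
_, steps_multi = autoregress([1.0, 2.0, 3.0], horizon=8, tokens_per_step=4)
print(steps_single, steps_multi)  # 8 2
```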

Robustness Techniques

The model pipeline includes advanced data filtering that removes low-quality or non-forecastable time series during training. Additional techniques like patch token embedding and random masking encode missing-value information and increase robustness to incomplete inputs.
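The masking idea can be illustrated with a simplified sketch (not the actual uni2ts implementation): a patch token carries both zero-filled values and a 0/1 observed-mask, so the token itself tells the model which entries were missing.

```python
import math

def patchify_with_mask(series, patch_size):
    """Split a series into fixed-size patches; missing values are
    zero-filled and a 0/1 observed-mask is appended to each patch,
    so each token encodes where data was missing."""
    tokens = []
    for i in range(0, len(series), patch_size):
        chunk = list(series[i:i + patch_size])
        chunk += [math.nan] * (patch_size - len(chunk))  # pad last patch
        mask = [0.0 if math.isnan(x) else 1.0 for x in chunk]
        values = [0.0 if math.isnan(x) else x for x in chunk]
        tokens.append(values + mask)
    return tokens

print(patchify_with_mask([1.0, math.nan, 3.0], patch_size=2))
# [[1.0, 0.0, 1.0, 0.0], [3.0, 0.0, 1.0, 0.0]]
```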

Richer Pretraining Mix

Moirai 2.0 trains on a diverse combination of sources to improve generalization:

  • Real-world sets such as GIFT-Eval Pretrain and Train
  • Synthetic augmentation via mixup, as used in Chronos
  • KernelSynth-generated synthetic series, a procedure introduced in the Chronos research
  • Internal operational data from Salesforce IT systems

This varied dataset foundation helps the model generalize to many forecasting tasks and domains.
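The mixup-style augmentation mentioned above can be sketched in a few lines (a simplified illustration; the Chronos variant can mix more than two series with randomly sampled weights): a convex combination of two length-aligned series yields a new synthetic training series.

```python
def ts_mixup(series_a, series_b, lam):
    """Mixup-style augmentation: a convex combination (weight `lam`)
    of two length-aligned series gives a new synthetic series."""
    return [lam * a + (1 - lam) * b for a, b in zip(series_a, series_b)]

print(ts_mixup([1.0, 2.0, 3.0], [10.0, 20.0, 30.0], lam=0.5))
# [5.5, 11.0, 16.5]
```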

Performance Highlights

Moirai 2.0 sets new standards compared to earlier versions and competing models:

  • Achieves the best MASE score on GIFT-Eval among non-data-leaking models
  • Matches previous state-of-the-art on CRPS
  • Versus Moirai_large: 16% better MASE, 13% better CRPS, 44% faster inference, and 96% fewer parameters

These improvements make high-performance forecasting more practical and accessible for both research and production deployments.
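For readers unfamiliar with the headline metric: MASE (mean absolute scaled error) divides the forecast's mean absolute error by the in-sample mean absolute error of a seasonal-naive baseline, so values below 1 beat that baseline. A minimal self-contained sketch:

```python
def mase(actual, forecast, insample, m=1):
    """Forecast MAE scaled by the in-sample MAE of the seasonal-naive
    baseline with period m; values < 1 beat that baseline."""
    mae = sum(abs(a - f) for a, f in zip(actual, forecast)) / len(actual)
    naive_mae = sum(abs(insample[t] - insample[t - m])
                    for t in range(m, len(insample))) / (len(insample) - m)
    return mae / naive_mae

print(mase(actual=[11, 12], forecast=[10, 14], insample=[1, 3, 5, 7, 9]))
# 0.75  (forecast MAE 1.5 / naive MAE 2.0)
```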

Practical Use Cases

The model's improvements are directly relevant to enterprise scenarios, including:

  • IT operations: capacity planning and anomaly detection
  • Sales forecasting: scalable revenue predictions
  • Demand forecasting: inventory optimization
  • Supply chain planning: better scheduling and waste reduction

Because Moirai 2.0 is both faster and much smaller, it enables high-quality forecasting at scale across varied infrastructure setups.

Getting Started with Moirai 2.0

Integration is straightforward for data scientists and developers. Salesforce provides examples and modules on Hugging Face and GitHub. A typical Python workflow uses the provided modules to load the model, prepare a dataset, and generate forecasts.

Sample Python Workflow

Import Libraries

import matplotlib.pyplot as plt
from gluonts.dataset.repository import dataset_recipes
from uni2ts.eval_util.data import get_gluonts_test_dataset
from uni2ts.model.moirai2 import Moirai2Forecast, Moirai2Module

Load Moirai 2.0

model = Moirai2Forecast(
    module=Moirai2Module.from_pretrained("Salesforce/moirai-2.0-R-small"),
    prediction_length=100,          # forecast horizon (future steps)
    context_length=1680,            # history length the model conditions on
    target_dim=1,                   # univariate target
    feat_dynamic_real_dim=0,        # no known-in-advance covariates
    past_feat_dynamic_real_dim=0,   # no past-only covariates
)

Load Dataset & Generate Forecasts

test_data, metadata = get_gluonts_test_dataset("electricity", prediction_length=None, regenerate=False)
predictor = model.create_predictor(batch_size=32)
forecasts = predictor.predict(test_data.input)

Visualize Results

# Visualization sketch: forecast median and 80% interval for the first
# six series, using the standard gluonts forecast API
fig, axes = plt.subplots(nrows=2, ncols=3, figsize=(25, 10))
for ax, forecast in zip(axes.flatten(), list(forecasts)[:6]):
    ax.plot(forecast.quantile(0.5), label="median forecast")
    ax.fill_between(range(len(forecast.quantile(0.5))), forecast.quantile(0.1), forecast.quantile(0.9), alpha=0.3)
plt.show()

Full examples, notebooks, and tutorials are available from Salesforce's resources on Hugging Face and GitHub.

Why This Matters

By combining a decoder-only transformer, multi-token prediction, improved data filtering, and a richer pretraining mix, Moirai 2.0 offers a practical, high-performance forecasting foundation. Its speed and compact size lower the barrier to deploying accurate forecasts across industries and operational environments.
