🤖 AI Summary
Google Research’s ARIMA_PLUS is a new, in-database framework for large-scale time series forecasting and anomaly detection that combines interpretable statistical modeling with cloud-scale execution in BigQuery. The paper describes a modular, sequential model that explicitly decomposes each series into holiday effects, seasonality, trend, and anomalies, making outputs easy to inspect and act on. On accuracy, ARIMA_PLUS outperforms classical statistical methods (ETS, ARIMA, TBATS, Prophet) and recent neural models (DeepAR, N-BEATS, PatchTST, TimeMixer) across 42 benchmark datasets from the Monash repository.
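That decomposition is directly queryable. As a minimal sketch, assuming a daily sales table (the project, dataset, table, and column names below are hypothetical), BigQuery ML's documented ARIMA_PLUS interface exposes the fitted components through ML.EXPLAIN_FORECAST:

```sql
-- Hypothetical example: train an ARIMA_PLUS model on a daily sales series.
-- Dataset, table, and column names are assumptions for illustration.
CREATE OR REPLACE MODEL `my_project.demo.sales_forecast`
OPTIONS (
  model_type = 'ARIMA_PLUS',
  time_series_timestamp_col = 'sale_date',
  time_series_data_col = 'daily_sales',
  holiday_region = 'US'   -- enable explicit holiday-effect modeling
) AS
SELECT sale_date, daily_sales
FROM `my_project.demo.sales_history`;

-- Inspect the decomposition: trend, seasonality, holiday effects, and
-- detected spikes/dips each come back as their own column.
SELECT
  time_series_timestamp,
  trend,
  seasonal_period_weekly,
  holiday_effect,
  spikes_and_dips
FROM ML.EXPLAIN_FORECAST(MODEL `my_project.demo.sales_forecast`,
                         STRUCT(30 AS horizon, 0.9 AS confidence_level));
```

Because each component is a separate column rather than a buried model internal, analysts can plot or audit the pieces individually, which is what the interpretability claim amounts to in practice.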
Where ARIMA_PLUS stands out is in operational scale and usability: it’s implemented directly inside BigQuery’s query engine, exposed through a simple SQL interface that automates data cleaning, model selection, and tuning, and it scales with managed cloud compute and storage. The system can forecast ~100 million time series in ~1.5 hours (≈18,000 series/sec), and the unified framework supports both forecasting and anomaly detection while preserving interpretability for business users. For teams needing production-ready, explainable forecasting at massive scale, ARIMA_PLUS bridges the gap between traditional, transparent models and modern cloud-native deployments.
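To give a feel for the scale-out pattern, here is a hedged sketch (again with hypothetical names) following the documented BigQuery ML workflow: a single CREATE MODEL statement with a time_series_id_col fits one model per series, and the same fitted models back both ML.FORECAST and ML.DETECT_ANOMALIES:

```sql
-- Hypothetical sketch: one statement fans out over many series by ID.
CREATE OR REPLACE MODEL `my_project.demo.per_item_forecast`
OPTIONS (
  model_type = 'ARIMA_PLUS',
  time_series_timestamp_col = 'sale_date',
  time_series_data_col = 'daily_sales',
  time_series_id_col = 'item_id',  -- one model per item, fit in one job
  auto_arima = TRUE                -- automated order selection and tuning
) AS
SELECT sale_date, daily_sales, item_id
FROM `my_project.demo.sales_history`;

-- Forecasting and anomaly detection share the same fitted models.
SELECT * FROM ML.FORECAST(MODEL `my_project.demo.per_item_forecast`,
                          STRUCT(14 AS horizon));

SELECT * FROM ML.DETECT_ANOMALIES(MODEL `my_project.demo.per_item_forecast`,
                                  STRUCT(0.95 AS anomaly_prob_threshold));
```

Called without an input table, ML.DETECT_ANOMALIES flags anomalies in the training data itself; passing a TABLE of new rows scores fresh observations instead.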