Datadog scales time series foundation models to 2.5B parameters (huggingface.co)

🤖 AI Summary
Datadog has released Toto 2.0, a family of time series foundation models for multivariate forecasting, scaled up to 2.5 billion parameters. The new generation uses u-μP (unit-scaled maximal update parametrization) and a unified training recipe, and forecast quality improves consistently as parameter count grows. Toto 2.0 sets new state-of-the-art results on three forecasting benchmarks: BOOM, GIFT-Eval, and the contamination-resistant TIME benchmark.

The release is notable for observability workloads, where accurate forecasts over telemetry are essential. Toto 2.0 supports zero-shot forecasting, multivariate inputs, probabilistic predictions, and variable prediction horizons, and it ships in several model sizes so users can trade accuracy against latency. Inference code is available on GitHub for integration into existing workflows.
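The input/output contract described above (multivariate context in, probabilistic forecasts over a chosen horizon out) can be illustrated with a toy stand-in. This is not Toto's actual API: the function name, quantile scheme, and naive last-value baseline below are all hypothetical, sketching only the shape conventions such a forecaster typically exposes.

```python
import numpy as np

def forecast_quantiles(context, horizon, quantiles=(0.1, 0.5, 0.9)):
    """Hypothetical stand-in for a probabilistic multivariate forecaster.

    context:   array of shape (n_variates, context_len)
    returns:   array of shape (len(quantiles), n_variates, horizon)
    """
    # Naive point forecast: repeat each variate's last observed value.
    point = np.repeat(context[:, -1:], horizon, axis=1)        # (n_variates, horizon)
    # Per-variate spread estimated from historical one-step differences.
    spread = np.std(np.diff(context, axis=1), axis=1, keepdims=True)  # (n_variates, 1)
    # Uncertainty widens with sqrt(h) as the horizon grows.
    steps = np.sqrt(np.arange(1, horizon + 1))                 # (horizon,)
    # Crude symmetric quantile offsets (a real model would learn these).
    offsets = (np.asarray(quantiles) - 0.5)[:, None, None] * 2.0
    return point[None] + offsets * spread[None] * steps[None, None, :]
```

The key point is the tensor layout: any number of variates, any horizon, and a quantile axis for probabilistic output, matching the multivariate, variable-horizon, probabilistic features the summary lists.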