Cosmic simulations that once needed supercomputers now run on a laptop (www.sciencedaily.com)

🤖 AI Summary
Researchers have released Effort.jl, a fast, differentiable emulator that reproduces predictions from the Effective Field Theory of Large-Scale Structure (EFTofLSS) with near-identical accuracy while cutting runtime from supercomputer days to minutes on a standard laptop. Built around a neural network trained on model outputs, Effort.jl embeds known parameter-response behavior and uses gradients (derivatives of predictions with respect to parameters) during training, so it needs far fewer examples to learn. The team shows that the emulator matches the original model on both simulated and real data, and in some cases it even permits inclusion of smaller-scale information that analysts previously had to discard for speed.

The technical novelty is twofold: physics-informed training (encoding how predictions change with parameters) and differentiability, which enables efficient, gradient-aware parameter inference (speeding up likelihood evaluations, MCMC, and optimization). That makes EFTofLSS analyses tractable for routine use on modest hardware and scales to the large datasets incoming from DESI, Euclid, and similar surveys.

Published in JCAP by Bonici et al., Effort.jl promises to democratize high-precision cosmological inference, accelerate iterative model development, and allow more exhaustive exploitation of forthcoming survey data without prohibitive compute costs.
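To make the two ideas concrete, here is a minimal sketch in JAX (not Effort.jl's actual code, which is written in Julia; the toy model, emulator architecture, and parameter names below are illustrative assumptions, not the paper's). It shows (1) gradient-aware training, where the loss penalizes mismatch in the emulator's parameter-derivatives as well as in its predictions, and (2) differentiability, where the gradient of a likelihood with respect to the parameters flows through the emulator for free.

```python
import jax
import jax.numpy as jnp

# Toy stand-in for an expensive model (hypothetical; in the paper's
# setting this would be an EFTofLSS prediction, not a one-liner).
def true_model(theta):
    return jnp.sin(theta[0]) + theta[1] ** 2

# Tiny linear "emulator" with trainable weights w (illustrative only;
# the real emulator is a neural network).
def emulator(w, theta):
    return w[0] * theta[0] + w[1] * theta[1] + w[2]

# (1) Gradient-aware training loss: match the model's values AND its
# derivatives with respect to the parameters, so each training example
# carries more information and fewer examples are needed.
def loss(w, thetas):
    def per_point(theta):
        val_err = (emulator(w, theta) - true_model(theta)) ** 2
        g_emu = jax.grad(emulator, argnums=1)(w, theta)
        g_true = jax.grad(true_model)(theta)
        grad_err = jnp.sum((g_emu - g_true) ** 2)
        return val_err + grad_err
    return jnp.mean(jax.vmap(per_point)(thetas))

# (2) Differentiable inference: the gradient of a Gaussian negative
# log-likelihood with respect to theta, computed through the emulator,
# is exactly what gradient-based MCMC or optimizers consume.
def neg_log_like(theta, w, data, sigma=0.1):
    return 0.5 * ((emulator(w, theta) - data) / sigma) ** 2

w = jnp.array([1.0, 0.5, 0.0])
theta0 = jnp.array([0.3, 0.7])
g = jax.grad(neg_log_like)(theta0, w, data=0.5)  # d(NLL)/d(theta)
```

A non-differentiable model would force finite-difference gradients (one extra model evaluation per parameter per step); here one reverse-mode pass yields the whole gradient, which is what makes likelihood evaluations and samplers cheap enough for a laptop.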