The murky economics of the data-centre investment boom (www.economist.com)

🤖 AI Summary
Consultancy McKinsey forecasts roughly $5.2 trillion in global capital spending over the next five years on chips, data centres and energy to support AI, a figure it may soon raise, and a wave of US announcements suggests a fevered rush to build generative-AI infrastructure. The resulting data-centre boom looks eerily familiar to some observers, who compare it to the 1990s telecoms bubble: huge upfront spending, long asset lives and uncertain demand trajectories that could leave investors holding stranded capacity if the economics don't pan out.

The murkiness stems from technical and commercial pressures. AI infrastructure is capital-intensive (GPUs/TPUs, specialized accelerators, dense cooling and massive power feeds), highly sensitive to utilization rates, and vulnerable to rapid efficiency gains in hardware or model architectures that could collapse demand. Energy and siting choices, regulatory constraints, and the concentration of demand among a handful of hyperscalers further complicate return profiles; co-location firms risk thin margins while hyperscalers internalize most of the value.

For the AI/ML community the boom matters because it will shape access to compute (its cost and centralization), influence model scale and carbon footprint, and determine who controls the backbone of future innovation: a high-stakes infrastructure race with significant technical, economic and environmental implications.
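The sensitivity to utilization rates can be made concrete with a back-of-envelope sketch. All figures below (capex per server slot, opex, rental price, asset life) are illustrative assumptions, not numbers from the article; the point is only that with heavy upfront depreciation and fixed opex, profit swings from loss to gain over a fairly narrow utilization band.

```python
# Illustrative only: hypothetical numbers, not from the article.
# Back-of-envelope sensitivity of a data-centre operator's returns
# to GPU utilization when renting out accelerator hours.

CAPEX = 40_000          # $ per GPU server slot (hardware + build-out), assumed
LIFETIME_YEARS = 5      # straight-line depreciation horizon, assumed
OPEX_PER_YEAR = 6_000   # $ power, cooling, staff per slot per year, assumed
PRICE_PER_HOUR = 2.50   # $ rental price per GPU-hour, assumed
HOURS_PER_YEAR = 8_760

def annual_profit(utilization: float) -> float:
    """Profit per slot per year at a given utilization rate (0-1)."""
    revenue = PRICE_PER_HOUR * HOURS_PER_YEAR * utilization
    depreciation = CAPEX / LIFETIME_YEARS
    return revenue - depreciation - OPEX_PER_YEAR

def breakeven_utilization() -> float:
    """Utilization at which revenue just covers depreciation + opex."""
    annual_cost = CAPEX / LIFETIME_YEARS + OPEX_PER_YEAR
    return annual_cost / (PRICE_PER_HOUR * HOURS_PER_YEAR)

for u in (0.3, 0.5, 0.7):
    print(f"utilization {u:.0%}: profit ${annual_profit(u):,.0f}/yr")
print(f"break-even utilization: {breakeven_utilization():.0%}")
```

Under these assumed numbers the break-even sits near 64% utilization: a slot running 70% of the time earns a modest profit, while one at 50% loses money every year. Falling rental prices (from hardware efficiency gains) or softer demand push the break-even point higher still, which is exactly the stranded-capacity risk the telecoms comparison points at.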