🤖 AI Summary
Goldman Sachs warns that America's lead in AI could be constrained not by chips or talent but by electricity: U.S. data centers already consume roughly 6% of national power and, driven by demand for AI model training and inference, could account for about 11% by 2030. Analysts say U.S. peak spare generation capacity has fallen from roughly 26% five years ago to roughly 19% today and could dip below a "critically tight" 15% threshold if growth continues. The U.S. still hosts 44% of global data center capacity, but retiring coal plants, long project timelines, and a global shortage of gas turbines limit how quickly new supply can come online.
That tightening contrasts with China's expansive energy build-out: having expanded renewables, gas, nuclear, and coal after its 2021 energy crunch, China is projected by Goldman to hold about 400 GW of effective spare capacity by 2030, more than three times the world's projected data center power needs. The report highlights that power infrastructure is a strategic, slow-to-fix bottleneck that could tilt the AI race, especially as Chinese subsidies and plentiful power lower operating costs for local chipmakers and cloud providers. Industry leaders such as Nvidia's Jensen Huang have echoed these concerns, underscoring that reliable, ample electricity may prove as decisive as compute in determining who wins the next phase of AI development.