Why Nvidia is worth $5 trillion: Inside a $35 billion, 1 gigawatt AI data center. (www.businessinsider.com)

🤖 AI Summary
Wall Street analysts now measure next-generation AI buildouts in gigawatts, and new estimates show why that matters: Bernstein pegs the build cost of a 1 GW AI data center at roughly $35 billion. That scale, about the output of a small nuclear reactor, underpins projects like xAI's Colossus 2, Meta's Prometheus, OpenAI's Stargate, and Amazon's Mount Rainier. The $35B spans an entire industrial ecosystem:

- **GPUs (≈39% of spend):** the single biggest line item, with TD Cowen estimating each GW requires over 1 million GPU dies. Given Nvidia's ~70% gross margins, analysts calculate Nvidia captures nearly 30% of total AI data center spending as profit, a core reason for its near-$5 trillion valuation.
- **Networking (≈13%):** favors Arista, Broadcom, and Marvell; optics and cable suppliers (InnoLight, Coherent, Amphenol) also benefit.
- **Power distribution (≈10%):** Eaton, Schneider, ABB, and Vertiv in play.
- **Real estate (≈10%)** and **thermal (≈4%)**, the latter calling for liquid and air cooling tech.

Running one GW costs about $1.3B/year in electricity while requiring surprisingly few staff, and foundry revenue flows to TSMC at roughly $1.3B per GW. While hyperscalers develop custom AI ASICs to cut costs, GPUs remain the economic center of gravity, and the new bottleneck is reliable grid and generation capacity, driving demand for Siemens Energy, GE Vernova, and Mitsubishi Heavy.
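The headline ratios above can be sanity-checked with back-of-envelope arithmetic. A minimal sketch, using the article's estimates as inputs; the ~$0.15/kWh industrial electricity rate is my assumption, not a figure from the article:

```python
# Back-of-envelope check of the figures quoted above.
# Inputs are the analysts' estimates cited in the summary.
BUILD_COST_B = 35.0         # Bernstein: ~$35B per 1 GW AI data center
GPU_SHARE = 0.39            # GPUs ≈ 39% of total spend
NVIDIA_GROSS_MARGIN = 0.70  # Nvidia's ~70% gross margin

gpu_spend_b = BUILD_COST_B * GPU_SHARE               # ≈ $13.7B on GPUs
nvidia_profit_b = gpu_spend_b * NVIDIA_GROSS_MARGIN  # ≈ $9.6B of that is margin
profit_share = nvidia_profit_b / BUILD_COST_B        # ≈ 0.27, i.e. "nearly 30%"

# Electricity: 1 GW drawn continuously for a year, at an assumed
# industrial rate of ~$0.15/kWh (the rate is my assumption).
HOURS_PER_YEAR = 8760
RATE_PER_KWH = 0.15
kwh_per_year = 1e6 * HOURS_PER_YEAR                  # 1 GW = 1e6 kW -> 8.76e9 kWh
electricity_cost_b = kwh_per_year * RATE_PER_KWH / 1e9  # ≈ $1.3B/year

print(f"Nvidia profit share of spend: {profit_share:.1%}")
print(f"Annual electricity cost: ${electricity_cost_b:.2f}B")
```

Both numbers land close to the quoted figures (≈27% profit capture, ≈$1.3B/year in power), so the summary's arithmetic is internally consistent under a plausible power price.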