OpenAI's 250GW data center target would match India (www.tomshardware.com)

🤖 AI Summary
OpenAI’s CEO reportedly outlined plans in a September 2025 memo to scale compute to as much as 250 gigawatts by 2033, roughly the electricity needed to power India’s 1.5 billion residents. Independent analysis translates that capacity into about 60 million Nvidia GB300-class GPUs in operation (implying roughly 30 million card purchases a year if economic replacement cycles are used as a proxy) and projects lifecycle energy use that could emit CO2 at rates comparable to, or exceeding, today’s top non-state emitters. Competitors are planning at similar scale (xAI has cited a 50M H100-equivalent target needing ~5 GW), so this is an industry-wide buildout, not a single-company anomaly. The ambition has immediate technical and policy implications: massive stress on electrical grids (price spikes, reduced power quality), huge water demand for cooling, and an expanded semiconductor fab buildout (97 new fabs recently started) with its own large power and water footprints (TSMC’s Fab 25, for example, is estimated at ~1 GW and ~100,000 metric tons of water per day), plus toxic chemical use and worker-health concerns. The story underscores that scaling AI is as much a physical-infrastructure and supply-chain problem as an algorithmic one, raising urgent needs for energy efficiency, renewables, novel cooling, circular GPU supply chains, stricter environmental regulation, and geopolitical management of critical materials.
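The 250 GW → ~60 million GPU translation can be sanity-checked with simple arithmetic. A minimal sketch, assuming an all-in draw of roughly 4.2 kW per GB300-class accelerator (chip TDP plus server, networking, and cooling overhead; this per-GPU figure and the two-year replacement cycle are assumptions for illustration, not numbers from the memo):

```python
# Back-of-envelope check of the reported GPU estimate.
TOTAL_POWER_W = 250e9        # 250 GW compute target by 2033
PER_GPU_ALL_IN_W = 4_200     # assumed all-in watts per GB300-class GPU (hypothetical)
REPLACEMENT_YEARS = 2        # assumed economic replacement cycle (proxy)

gpus_in_operation = TOTAL_POWER_W / PER_GPU_ALL_IN_W
annual_purchases = gpus_in_operation / REPLACEMENT_YEARS

print(f"GPUs in operation: ~{gpus_in_operation / 1e6:.0f} million")
print(f"Implied annual purchases: ~{annual_purchases / 1e6:.0f} million")
```

Under these assumptions the arithmetic lands near the article's figures (~60 million GPUs in operation, ~30 million purchases per year); different overhead or replacement-cycle assumptions shift the totals proportionally.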