🤖 AI Summary
Generative AI’s rapid growth is already reshaping energy and infrastructure. Piecing together fragmentary public figures and industry reports, analysts estimate that ChatGPT alone (700 million weekly users, ~2.5 billion queries/day) consumes roughly 0.34 Wh per query (an OpenAI figure cited by Sam Altman). That works out to about 850 MWh/day across nearly 1 trillion queries/year, energy equivalent to powering ~29,000 U.S. homes for a year. Broader industry estimates vary widely (some researchers put complex-query costs above 20 Wh), but Schneider Electric’s analysis uses ~2.9 Wh/query to arrive at 15 TWh of generative-AI electricity demand in 2025 and a startling 347 TWh by 2030.
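A quick sanity check on those headline numbers; a minimal sketch, where the ~10,700 kWh/year average U.S. home consumption is an assumed EIA-style figure, not something stated in the article:

```python
# Back-of-envelope check of the ChatGPT energy figures cited above.
QUERIES_PER_DAY = 2.5e9   # ~2.5 billion queries/day
WH_PER_QUERY = 0.34       # OpenAI figure cited by Sam Altman

daily_wh = QUERIES_PER_DAY * WH_PER_QUERY
print(f"Daily energy: {daily_wh / 1e6:,.0f} MWh")  # ~850 MWh/day

annual_queries = QUERIES_PER_DAY * 365
print(f"Annual queries: {annual_queries / 1e12:.2f} trillion")  # ~0.91 trillion

# Assumption (not from the article): average U.S. home uses ~10,700 kWh/year.
HOME_KWH_PER_YEAR = 10_700
annual_kwh = daily_wh * 365 / 1e3
print(f"Homes-equivalent: ~{annual_kwh / HOME_KWH_PER_YEAR:,.0f}")  # ~29,000
```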
That projected surge has concrete infrastructure implications: closing the ~332 TWh gap by 2030 would require dozens of “Stargate-class” 1-gigawatt data-center campuses (each supplying ~8.76 TWh/year), part of planned collaborations among major AI firms, and it would place inference (real-time model serving) rather than training at the center of energy demand. The forecast underscores urgent trade-offs for the AI/ML community: massive data-center buildout, greater strain on grids, and heightened scrutiny of model efficiency, agent architectures, and low-carbon power sourcing. At the same time, wide uncertainty in per-query costs and user-growth assumptions means outcomes could be much better or worse depending on efficiency gains, agent adoption, and infrastructure choices.
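The “dozens of campuses” claim follows directly from the stated figures; a minimal sketch of that gap arithmetic, assuming each campus runs at full 1 GW load year-round (the campus count is derived here, not quoted from the source):

```python
# How many 1 GW ("Stargate-class") campuses would close the projected gap?
TWH_2025 = 15.0    # Schneider Electric estimate for 2025
TWH_2030 = 347.0   # Schneider Electric projection for 2030
gap_twh = TWH_2030 - TWH_2025  # ~332 TWh

# Assumption: continuous full-load operation.
# 1 GW * 8,760 hours/year = 8.76 TWh/year per campus.
CAMPUS_TWH_PER_YEAR = 1.0 * 8_760 / 1_000

campuses = gap_twh / CAMPUS_TWH_PER_YEAR
print(f"Gap: {gap_twh:.0f} TWh -> ~{campuses:.0f} campuses")  # ~38
```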