🤖 AI Summary
Alarms that generative AI will gobble up huge amounts of electricity are likely overstated. Historical parallels, from 1999 PC forecasts to the 2007 data-centre scare, show that computing's share of electricity has stayed roughly 1–2% (and under 1% of global greenhouse-gas emissions) even as compute power rose ~1,000×. Efficiency gains (smaller, lower-power devices; cloud consolidation; specialized chips and model optimizations) have repeatedly absorbed that growth. Current AI spending is also modest: most organizations allocate only 5–10% of IT budgets to AI, and even OpenAI's reported ~$3B revenue against ~$7B development cost does not amount to a systemic electricity shock, so large increases in AI compute would likely be largely offset. Cryptocurrency mining, at 0.5–1% of world electricity, is a clearer and more avoidable source of waste.
The technical implication is that faster AI growth does not automatically imply catastrophic power demand: a tenfold rise in AI expenditure by 2030 might roughly double IT electricity use, and continued efficiency gains and shifting workloads could largely neutralize even that. Policy takeaways: don't slow the transition to renewables over overblown AI energy fears; instead, target genuinely high-emission, hard-to-decarbonize sectors (cement alone accounts for ≈7% of emissions) and curb wasteful practices such as crypto mining. Misleading projections persist because they serve political or ideological narratives and because tech fears make vivid stories, but the data suggest AI's electricity footprint is manageable.
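The tenfold-growth arithmetic above can be sketched as a back-of-envelope calculation. The 5–10% budget shares come from the summary; the assumption that IT electricity use scales linearly with IT spending is a simplification introduced here, not a claim from the source:

```python
# Back-of-envelope: how a tenfold rise in AI spending scales total IT
# spending, and (under a simplifying linear assumption) IT electricity use.

def it_growth_factor(ai_share: float, ai_multiplier: float = 10.0) -> float:
    """Factor by which total IT spend grows when the AI portion
    (ai_share of today's budget) is multiplied by ai_multiplier
    and the non-AI portion stays flat."""
    return (1.0 - ai_share) + ai_share * ai_multiplier

# AI at 5-10% of IT budgets today (figures from the summary):
low = it_growth_factor(0.05)   # 0.95 + 0.50 = 1.45x
high = it_growth_factor(0.10)  # 0.90 + 1.00 = 1.90x, i.e. roughly "double"
print(f"10x AI spend -> total IT grows {low:.2f}x to {high:.2f}x")
```

At a 10% AI share, a tenfold AI spend multiplies the whole IT budget by about 1.9, which is where the "might double" figure comes from; at 5% the factor is only 1.45, and any per-unit efficiency gains would shrink the electricity impact further.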