🤖 AI Summary
Google published the first official per-query energy figure for its Gemini text model: a median text prompt uses about 0.24 Wh of data-center electricity. That lets reporters compare AI prompts against familiar activities — one prompt is roughly 1.5% of an iPhone 17 charge or under 10 seconds of streaming on a 55" TV — and shows that most household energy for streaming is consumed by the end device (TV/laptop/phone), not the data center. On an individual level this is tiny: an average user sending 10–20 prompts/day consumes ~3.6 Wh (≈0.03% of daily household electricity), while a heavy user at 50 prompts/day reaches ~0.15% (similar to a TV in standby). Google also estimates 0.26 ml of water and ~0.03 g CO2e per prompt.
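The per-user arithmetic above can be sanity-checked with a short sketch. This is a back-of-envelope illustration, not Google's methodology: the 0.24 Wh per prompt comes from the article, while the ~12 kWh/day household figure is an assumed round number, so the resulting percentages depend on that assumption.

```python
# Back-of-envelope check of the per-user energy figures.
# ASSUMPTION: household daily electricity of ~12 kWh is a hypothetical
# round number for illustration, not a figure from Google's report.
ENERGY_PER_PROMPT_WH = 0.24   # Google's median for a Gemini text prompt
HOUSEHOLD_DAILY_WH = 12_000   # assumed ~12 kWh/day household usage


def daily_prompt_energy_wh(prompts_per_day: float) -> float:
    """Energy spent on prompts per day, in watt-hours."""
    return prompts_per_day * ENERGY_PER_PROMPT_WH


def share_of_household(prompts_per_day: float) -> float:
    """Prompt energy as a percentage of the assumed household usage."""
    return 100 * daily_prompt_energy_wh(prompts_per_day) / HOUSEHOLD_DAILY_WH


for prompts in (15, 50):
    wh = daily_prompt_energy_wh(prompts)
    pct = share_of_household(prompts)
    print(f"{prompts} prompts/day -> {wh:.1f} Wh ({pct:.2f}% of household)")
```

With these assumptions, 15 prompts/day lands at ~3.6 Wh (~0.03% of the household total), matching the article's average-user figure; the heavy-user percentage shifts with whatever household consumption one assumes.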
The nuance: the figure covers only text generation inside data centers (no image or video numbers), and it is a median, so heavier workloads can cost more. While a single prompt is a "drop in the bucket," scale matters: billions of daily prompts (OpenAI alone handles ~2.5B/day) add up for power grids and cooling demand. Notably, on data-center energy alone, an AI prompt is more compute-intensive than typical streaming tasks (one prompt uses about as much as 3.3 seconds of cloud gaming). The figures give a useful baseline for efficiency improvements, carbon accounting, and infrastructure planning as usage grows.
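The scale argument can also be made concrete. A rough sketch, assuming every one of the ~2.5B daily prompts (OpenAI's reported volume, used here only as an order-of-magnitude stand-in) costs Gemini's 0.24 Wh median:

```python
# Rough scale-up of the per-prompt median to a large provider's volume.
# ASSUMPTION: all prompts cost the 0.24 Wh text-generation median, and
# the 2.5e9/day volume is borrowed from OpenAI as a stand-in figure.
ENERGY_PER_PROMPT_WH = 0.24
PROMPTS_PER_DAY = 2.5e9

daily_wh = PROMPTS_PER_DAY * ENERGY_PER_PROMPT_WH
daily_mwh = daily_wh / 1e6          # Wh -> MWh
annual_gwh = daily_mwh * 365 / 1e3  # MWh/day -> GWh/year

print(f"~{daily_mwh:.0f} MWh/day, ~{annual_gwh:.0f} GWh/year")
```

Under these assumptions the total is on the order of 600 MWh/day (~219 GWh/year): negligible per user, but a measurable load at grid scale.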