ChatGPT prompt consumes equivalent to 10s of Netflix (simonwillison.net)

🤖 AI Summary
In June 2025 Sam Altman said the “average” ChatGPT query uses about 0.34 watt‑hours (Wh) of electricity. Using the International Energy Agency’s 2019 Netflix streaming estimate (0.12–0.24 kWh per hour, i.e., 120–240 Wh/h), a single prompt is roughly equivalent to 5.1–10.2 seconds of Netflix: 0.34 Wh ÷ (240 Wh / 3600 s) ≈ 5.1 s with the higher streaming estimate, and about twice that (≈ 10.2 s) with the lower one. The raw math makes the abstract 0.34 Wh figure concrete and shows that per‑inference energy is small compared with an hour of video streaming.

That comparison is a useful communication tool for the AI/ML community, but it’s not the whole story: Altman’s number covers inference only, not model training (which can be orders of magnitude higher), data‑center buildout, networking, end‑user device energy, or regional carbon intensity. For practitioners and researchers, per‑prompt energy metrics help benchmark model efficiency, guide architecture and latency/power tradeoffs, and inform deployment decisions and sustainability claims—but they must be paired with lifecycle accounting (training, replication, PUE, carbon intensity) to assess true environmental impact.
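The arithmetic behind the comparison can be checked in a few lines of Python, using the two figures cited above (0.34 Wh per prompt; 120–240 Wh per streaming hour):

```python
# Back-of-envelope check: how many seconds of Netflix streaming
# consume the same energy as one ChatGPT prompt?
PROMPT_WH = 0.34                    # Altman's per-query figure (June 2025)
NETFLIX_WH_PER_HOUR = (120, 240)    # IEA 2019 streaming estimate, low/high

for wh_per_hour in NETFLIX_WH_PER_HOUR:
    wh_per_second = wh_per_hour / 3600
    seconds = PROMPT_WH / wh_per_second
    print(f"At {wh_per_hour} Wh/h: one prompt ≈ {seconds:.1f} s of streaming")
```

Running this prints about 10.2 s for the low (120 Wh/h) estimate and 5.1 s for the high (240 Wh/h) one, matching the 5.1–10.2 s range above.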