🤖 AI Summary
A new analysis suggests that large language models (LLMs) consume significantly less mobile energy than traditional ad-supported web searches, with results indicating that an LLM session uses 5.4 times less energy than a search session. This finding contrasts with an earlier claim that an LLM response requires ten times more energy than a Google search query, a claim that considered only server-side computation. By shifting the focus to the full user experience, from network transmission to client-device power consumption, the study finds that the overhead of downloading and rendering ad-heavy web pages greatly increases the total energy cost of a search session.
This research is significant for the AI/ML community because it challenges the prevailing narrative about LLM energy efficiency and highlights the broader environmental impact of programmatic advertising in web search. The study draws on empirical data, including Google's recent report that the median energy consumption of a text prompt is around 0.24 Wh, compared with roughly 0.44 Wh for the network energy alone of loading a typical mobile web page. By providing a more comprehensive lifecycle assessment, the paper argues that the entire energy ecosystem must be considered when evaluating the sustainability of AI technologies against conventional web search, potentially shaping future regulatory frameworks and ESG discussions.
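To make the comparison concrete, here is a minimal sketch of a per-session energy calculation using only the two figures quoted above (0.24 Wh per text prompt, 0.44 Wh of network energy per mobile page load). The session shapes (prompts per LLM session, pages per search session) are hypothetical assumptions for illustration, and the search side counts network energy only; the study's 5.4x result additionally includes client-side rendering and ad-delivery overhead not modeled here.

```python
# Figures quoted in the summary above.
LLM_PROMPT_WH = 0.24    # Google's reported median energy per text prompt (Wh)
PAGE_NETWORK_WH = 0.44  # network energy alone for a typical mobile page load (Wh)

def session_energy_llm(prompts: int) -> float:
    """Energy for an LLM session of `prompts` text prompts, in Wh."""
    return prompts * LLM_PROMPT_WH

def session_energy_search(pages: int) -> float:
    """Network-only energy for a search session of `pages` page loads, in Wh.
    A full accounting would add client rendering and ad-delivery overhead."""
    return pages * PAGE_NETWORK_WH

# Hypothetical session shapes: 2 LLM prompts vs. 3 ad-supported page loads.
llm_wh = session_energy_llm(2)        # 0.48 Wh
search_wh = session_energy_search(3)  # 1.32 Wh
print(f"search/LLM ratio: {search_wh / llm_wh:.2f}x")  # 2.75x on network energy alone
```

Even on network energy alone, the hypothetical search session costs more than the LLM session; the study's larger 5.4x gap comes from layering on device-side power draw for rendering ad-heavy pages.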