🤖 AI Summary
TechRadar’s deep dive quantifies AI’s footprint and puts it in context: individual prompts are tiny, but aggregate demand is meaningful. OpenAI handles roughly 2.5 billion prompts a day, and AI accounts for roughly 10–20% of global data-center electricity use (the sector consumes about 415 TWh per year). Of data-center power overall, roughly 50% goes to enterprise and government workloads, about 15% to streaming video, and only tiny shares to things like cloud photo storage; AI remains a small slice of total global electricity, but a growing one. Data-center power demand is forecast to double by 2030, with AI driving a large portion of that growth, even as other sectors (notably EVs and 5G) scale past it.
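
To make the aggregate figures concrete, here is a quick back-of-envelope calculation. It is only a rough sketch using the article's rounded estimates (415 TWh/year, a 10–20% AI share, a doubling by 2030), not measured values:

```python
# Implied AI electricity demand, derived from the figures quoted above.
DATA_CENTER_TWH_PER_YEAR = 415            # total global data-center electricity use
AI_SHARE_LOW, AI_SHARE_HIGH = 0.10, 0.20  # AI's estimated slice of that demand

ai_low = DATA_CENTER_TWH_PER_YEAR * AI_SHARE_LOW
ai_high = DATA_CENTER_TWH_PER_YEAR * AI_SHARE_HIGH
print(f"Implied AI electricity use: ~{ai_low:.0f}-{ai_high:.0f} TWh/year")   # ~42-83 TWh/year

# Forecast: total data-center demand roughly doubles by 2030.
print(f"Data-center demand if doubled by 2030: ~{DATA_CENTER_TWH_PER_YEAR * 2} TWh/year")
```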
The analysis also breaks out water and carbon: data-center cooling uses around 100 GL of water per year (roughly comparable to watering golf courses in the rain), and a single 500 ml water bottle could cool around 2,000 prompts. Emissions work out to about 0.03 g CO2e per prompt, and AI power use as a whole accounts for roughly 0.07% of global CO2, on par with a small country such as Denmark or with the ridesharing sector. The headline is nuanced: AI’s per-prompt impact is minuscule, but concentrated data-center growth creates significant local environmental pressures (water, ecosystems, grid load), making infrastructure planning, efficient cooling, and clean-energy sourcing crucial for responsible AI scaling.
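
The per-prompt figures can be sanity-checked the same way. Again, this is a sketch built on the article's rounded numbers (500 ml per ~2,000 prompts, 0.03 g CO2e per prompt, 2.5 billion prompts a day); actual values vary by model, data center, and cooling design, and the daily CO2 figure below covers only the quoted prompt volume, not AI power use as a whole:

```python
# Rough per-prompt water and carbon arithmetic from the figures quoted above.
BOTTLE_ML = 500
PROMPTS_PER_BOTTLE = 2_000
water_ml_per_prompt = BOTTLE_ML / PROMPTS_PER_BOTTLE
print(f"Cooling water per prompt: ~{water_ml_per_prompt} ml")                # ~0.25 ml

CO2_G_PER_PROMPT = 0.03
PROMPTS_PER_DAY = 2.5e9          # OpenAI's reported daily prompt volume
daily_co2_tonnes = CO2_G_PER_PROMPT * PROMPTS_PER_DAY / 1e6
print(f"CO2e from that prompt volume: ~{daily_co2_tonnes:.0f} tonnes/day")   # ~75 t/day
```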