Google's decade-long bet on custom chips is turning into the company's secret weapon in the AI race (www.cnbc.com)

🤖 AI Summary
Google revealed that its seventh‑generation custom chip, Ironwood — a Tensor Processing Unit (TPU) ASIC built over a decade of development — will be broadly available in the coming weeks. Google says Ironwood is designed for the heaviest AI workloads (large‑model training, real‑time chatbots and agents) and delivers more than four times the performance of the prior TPU generation. The chips are offered as cloud services (not sold as hardware); early customers include Anthropic, which plans to use up to 1 million Ironwood TPUs and expects Google to bring well over a gigawatt of AI compute capacity online in 2026. The TPU rollout coincides with strong cloud growth (Q3 revenue +34% to $15.15B) and massive multi‑year deals with Anthropic and Meta, positioning TPUs as a core driver of Google Cloud momentum.

For the AI/ML community the implications are twofold: performance and supply. TPUs give Google a distinct cost‑performance and energy‑efficiency advantage for targeted AI workloads at hyperscale, helping mitigate the chip shortages and rising power constraints that many expect will be the next bottleneck. While Nvidia remains the dominant GPU supplier, Google's large‑scale ASIC deployment — plus competitor chips from AWS (Inferentia/Trainium) and Microsoft (Maia) — signals an industry shift toward custom silicon and multi‑chip strategies to optimize cost, redundancy and latency. Ambitious moves like Project Suncatcher (solar‑powered satellites with TPUs) and rising CapEx underline Google's bet that bespoke hardware will be a secret weapon in the AI arms race.