Anthropic signs multi-billion dollar Google deal that gives it access to a million TPUs (www.techradar.com)

🤖 AI Summary
Anthropic has signed a multibillion-dollar agreement with Google Cloud to access up to one million Google Cloud TPUs for training and running its Claude family of large language models. Anthropic describes the deal as worth "tens of billions" and says the added capacity could push it to well over a gigawatt of compute by 2026. The company will continue a multi-vendor strategy—keeping Nvidia GPUs and Amazon Trainium in its stack—and says Amazon remains its primary training partner and cloud provider even as it expands TPU usage. Google highlights continued TPU innovation, including its seventh-generation "Ironwood" accelerators, as part of the rationale.

For the AI/ML community this matters because it signals a major shift in raw training and inference supply: large TPU allocations at scale can lower cost-per-token and energy per compute unit compared with some GPU setups, potentially changing the economics of LLM development and deployment. The deal strengthens Google Cloud's challenge to Nvidia's ecosystem and underscores the industry trend toward diversified accelerator sourcing. Practically, Anthropic's move could accelerate model scaling, reduce operational costs and energy intensity, and influence architectural and deployment choices across LLM providers and cloud vendors.