Nvidia says its GPUs are a 'generation ahead' of Google's AI chips (www.cnbc.com)

🤖 AI Summary
Nvidia pushed back after a report that Meta might deploy Google's tensor processing units (TPUs) in its data centers, telling investors its GPUs remain "a generation ahead." The company's stock dipped about 3% on the news; Nvidia responded on X and in earnings commentary by emphasizing the performance, versatility and "fungibility" of its Blackwell GPU generation compared with ASIC-style chips like Google's TPUs. Nvidia highlighted that it still supplies Google and argued its GPUs can run every AI model everywhere, while TPUs are primarily Google's in-house accelerators, offered to others only via Google Cloud.

The spat matters because Nvidia controls roughly 90% of the AI accelerator market, but Google's in-house TPUs have gained attention after Google trained and released Gemini 3 on them and began marketing TPU access through its cloud services. Technical implications center on the tradeoff between general-purpose GPUs (flexible across many models and workloads) and ASIC-like TPUs (specialized, and potentially more cost- or energy-efficient for certain models). Nvidia also reiterated that "scaling laws" (the idea that more chips and data yield better models) will continue to drive demand for its systems, framing Google's chips as complementary to its business rather than an existential threat.