OpenAI hands AMD a big win in AI chip race. What does it mean for Nvidia? (www.cnbc.com)

🤖 AI Summary
OpenAI struck a major multiyear deal to buy 6 gigawatts of AI chips from AMD for inference workloads, plus warrants that could give OpenAI roughly 10% of AMD if certain milestones are met. AMD says each gigawatt represents “significant, double‑digit billions” in revenue, and the announcement sent AMD shares up roughly 30% while Nvidia dipped about 1.5%. OpenAI maintains deep ties with Nvidia as well (Nvidia has committed $100B to build 10 GW of capacity for OpenAI), and it is also working with Broadcom on custom chips, signaling a multi‑vendor strategy to massively expand its data‑center footprint.

Technically and strategically, the deal matters because it carves inference (day‑to‑day model serving) out from training (large‑scale model optimization), workloads historically dominated by Nvidia, and it shows OpenAI diversifying suppliers to meet explosive compute demand. AMD’s next‑generation silicon, paired with software enablement for inference, could lower supply risk, increase capacity, and pressure pricing and feature‑rollout timelines across the ecosystem. For the AI/ML community, more competition at hyperscaler scale means faster provisioning of inference capacity, potentially broader hardware/software co‑optimization, and a shift in how models are deployed operationally, but it does not erase Nvidia’s entrenched lead in high‑scale training workloads.