🤖 AI Summary
Global DRAM prices have surged in 2025, driven by booming AI data-center demand and constrained supply. DDR5 retail kit prices have more than doubled since mid-2025 (one cited example: a 32GB DDR5 kit jumping from $110 to $442), and DRAM contract prices were up 171.8% year-over-year in Q3. Cloud providers report paying premiums of up to 50%, suppliers are often fulfilling only ~70% of orders, and retailers are rationing modules. The squeeze extends to NAND flash and HDDs, pushing cloud firms toward SSDs, and analysts estimate DRAM-driven server costs could rise 10–25% for hyperscalers. The largest purchases are striking in scale: reported deals (e.g., OpenAI's "Stargate") for up to 900,000 DRAM wafers per month would approach ~40% of global output if realized, while HBM and next-generation NAND capacity is largely prebooked through 2026.
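To put the headline figures in perspective, here is a minimal back-of-envelope sketch using only the numbers quoted above; the derived values (the retail price-increase factor and the implied global wafer output of roughly 2.25 million wafers/month) are simple implications of those figures, not independently sourced statistics.

```python
# Back-of-envelope check of the figures quoted above.
# Inputs are the article's numbers; derived values are implications only.

kit_price_mid_2025 = 110   # USD, 32GB DDR5 kit (article's example)
kit_price_now = 442        # USD, same kit after the surge

increase_factor = kit_price_now / kit_price_mid_2025
print(f"Retail kit price increase: {increase_factor:.1f}x "
      f"(+{(increase_factor - 1) * 100:.0f}%)")  # ~4.0x, well past "doubled"

# Reported Stargate-scale demand vs. its quoted share of global output
stargate_wafers_per_month = 900_000
quoted_share_of_output = 0.40

implied_global_output = stargate_wafers_per_month / quoted_share_of_output
print(f"Implied global DRAM output: ~{implied_global_output:,.0f} wafers/month")
```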
But industry structure complicates the "just AI demand" story. Production is concentrated among Samsung, SK Hynix, and Micron, which have diverted capacity into high-margin HBM for accelerators, reducing commodity DRAM supply and prolonging the tightness. Given past guilty pleas and lawsuits over DRAM price-fixing, skepticism about tacit coordination persists: firms benefit from high prices and are cautious about expanding capacity amid fears of an AI bubble. The upshot for the AI/ML community: higher deployment costs, potential slowdowns in data-center rollouts if memory remains scarce, and elevated risk of a future boom-and-bust cycle if supply ramps or demand softens.