Micron Humming Along on All Memory Cylinders (www.nextplatform.com)

🤖 AI Summary
Micron posted a blockbuster quarter as AI-driven demand for high-bandwidth memory (HBM) and datacenter upgrades lifted revenues 46% year-over-year to $11.32B, drove operating income up 2.4x to $3.65B, and pushed net income to $3.2B (a 28.3% margin). DRAM sales soared 67.8% to $8.94B while NAND softened slightly; Micron's Cloud Memory unit (HBM, CXL, DDR, and datacenter flash) grew 3.1x to $4.53B at a 48% operating margin. The company expects DRAM supply to stay tight through 2026, plans roughly $18B in FY2026 capex, and guided Q1 FY2026 sales to ~$12.5B.

For the AI/ML community, the key takeaway is that a U.S.-based HBM supplier is now a major, reliable source of stacked memory at a time when GPU/XPU fleets for GenAI training and inference are consuming vast HBM capacity, which reduces geopolitical supply risk.

Technically, Micron has become a lead supplier of HBM3E: it supplied the HBM3E for Nvidia's H200, and most of its 2026 HBM3E capacity is already pre-sold. It is also advancing HBM4 and HBM4E: Micron boosted HBM4 pin speeds to 11 Gb/s (about 2.8 TB/s per stack) and supports 8-high and 12-high stacks, while HBM4E will add base-die chiplet options enabling custom on-memory functionality. Micron estimates ~20% HBM market share by Q3 FY2026 and sees the HBM market expanding toward a $100B opportunity by 2030, a critical capacity expansion for large-scale AI infrastructure.
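As a rough sanity check on the per-stack figure, the arithmetic is straightforward if you assume the 2048-bit per-stack interface defined by the JEDEC HBM4 standard (an assumption; the article quotes only the pin speed): 11 Gb/s per pin across 2,048 pins works out to roughly 2.8 TB/s. A minimal sketch:

```python
# Sanity check: 11 Gb/s per pin -> ~2.8 TB/s per HBM4 stack.
# Assumes the 2048-bit per-stack interface from the JEDEC HBM4 standard;
# the 11 Gb/s pin speed is the figure quoted in the article.

pin_speed_gbps = 11            # per-pin data rate in Gb/s (from the article)
interface_width_bits = 2048    # HBM4 per-stack I/O width (JEDEC spec, assumed)

aggregate_gbps = pin_speed_gbps * interface_width_bits  # 22,528 Gb/s total
stack_bandwidth_tbs = aggregate_gbps / 8 / 1000         # bits -> bytes, GB -> TB

print(f"~{stack_bandwidth_tbs:.2f} TB/s per stack")     # ~2.82 TB/s
```

The result (~2.82 TB/s) matches the article's "about 2.8 TB/s per stack" figure.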