🤖 AI Summary
Samsung has announced the commercial shipment of its new high-bandwidth memory, HBM4, a significant advance designed to boost speed and efficiency for AI workloads in data centers. HBM4 delivers per-pin transfer rates of 11.7 Gbps, with configurations reaching up to 13 Gbps, and a total memory bandwidth of 3.3 TB/s per stack. Produced on Samsung's sixth-generation 10nm-class DRAM process combined with a 4nm logic base die, the new memory is aimed at the growing performance demands of AI and machine-learning applications.
The technical improvements in HBM4 also include a notable 40% gain in power efficiency over its predecessor, HBM3E, thanks to advanced voltage management and thermal enhancements. With capacity options ranging from 24 GB up to a potential 48 GB in stacked configurations, Samsung is positioning itself for the significant growth it anticipates in the HBM market. The move not only strengthens Samsung's competitive edge against rivals but also underscores its commitment to innovation in memory technology, paving the way for faster and more efficient AI systems.
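The relationship between the quoted per-pin rates and the per-stack bandwidth can be sanity-checked with simple arithmetic. The sketch below assumes the 2048-bit per-stack interface defined by the JEDEC HBM4 standard (the article does not state the interface width, so this is an assumption); the function name is hypothetical.

```python
def stack_bandwidth_tbps(pin_rate_gbps: float, bus_width_bits: int = 2048) -> float:
    """Per-stack bandwidth in TB/s: per-pin rate (Gbps) x interface width,
    divided by 8 bits/byte and 1000 GB/TB.

    Assumes the 2048-bit interface of the JEDEC HBM4 standard.
    """
    return pin_rate_gbps * bus_width_bits / 8 / 1000

# At the shipping 11.7 Gbps rate, a stack lands just under 3 TB/s.
print(round(stack_bandwidth_tbps(11.7), 2))  # -> 3.0

# The quoted 3.3 TB/s figure matches the 13 Gbps configuration.
print(round(stack_bandwidth_tbps(13.0), 2))  # -> 3.33
```

Under these assumptions, the 3.3 TB/s headline figure corresponds to the up-to-13 Gbps configuration rather than the 11.7 Gbps shipping rate.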