🤖 AI Summary
Phison CEO Pua Khein Seng has argued that the real bottleneck in AI models is not computing power but memory, a constraint that affects both local inference on devices and large AI data centers. In an interview, Pua noted that the industry's focus on GPUs has overshadowed the need for sufficient memory: many systems ship with too little DRAM and crash under load. To address this, Phison is advancing its aiDAPTIV+ technology, which lets SSDs serve as an extended memory pool, improving responsiveness during AI inference. This approach could significantly lower Time to First Token (TTFT), the delay before a model produces its first output, making AI interactions feel smoother and more responsive.
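TTFT is simply the wall-clock time from submitting a request until the first token arrives; memory-bound systems that page model weights or KV-cache data slowly will show it directly in this number. A minimal sketch of how the metric is measured, using a hypothetical `fake_model_stream` as a stand-in for a real inference engine's token stream (the function names and delay are illustrative assumptions, not Phison's API):

```python
import time

def first_token_latency(stream):
    """Return the first token and the seconds elapsed until it arrived.

    `stream` is any iterator yielding tokens from an inference engine.
    """
    start = time.perf_counter()
    first = next(stream)  # blocks until the model emits its first token
    ttft = time.perf_counter() - start
    return first, ttft

def fake_model_stream(delay=0.05):
    # Hypothetical stand-in for a real token stream; the delay models
    # prompt processing plus weight/KV-cache memory access time.
    time.sleep(delay)
    yield "Hello"
    yield " world"

token, ttft = first_token_latency(fake_model_stream())
print(f"first token: {token!r}, TTFT: {ttft:.3f}s")
```

Anything that speeds up getting weights and cached context into the compute units, whether more DRAM or an SSD-backed memory pool, shrinks the `delay` term and thus the measured TTFT.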
Phison is also pushing enterprise storage forward with its new 244TB SSDs. Pua tied cloud service providers' profitability directly to storage capacity, arguing that while heavy investment has gone into GPUs, actual earnings depend on storing data and running inference efficiently. By using high-capacity NAND flash to expand the memory hierarchy, Phison aims to curb the practice of buying extra GPUs merely for their VRAM, letting companies scale compute and memory independently. This vision shifts the focus from ever-larger, more power-hungry GPUs to a memory-centric AI infrastructure built on scalable, modular storage.