🤖 AI Summary
Dell has launched a Linux-only variant of the Pro Max 16 Plus configured as a mobile AI workstation that ships with two soldered Qualcomm Cloud AI 100 (AIC100) accelerators. Each AIC100 has 16 VLIW compute cores and 144 MB of SRAM; the two chips present as a single 32‑core accelerator and share 64 GB of dedicated LPDDR5X (advertised ~272 GB/s). Qualcomm positions the AIC100 for inference: up to 400 TFLOPS FP16, support for models “up to 120B parameters,” and a combined maximum power draw around 150 W. Dell pairs the accelerators with Intel Arrow Lake‑HX CPUs, Ubuntu 24.04 LTS and a typical workstation spec sheet (16" 1920×1200 display, 64–128 GB system RAM, 1–4 TB SSD, Thunderbolt 5/4, Wi‑Fi 7). Prices start near €7,040 and go past €9,500.
For the AI/ML community this is notable as a portable, vendor‑integrated inference dev kit that swaps Nvidia GPUs for Qualcomm’s ASICs, highlighting Qualcomm’s move from datacenter silicon into developer hardware. Key implications: the AIC100 is optimized for FP16 inference workloads and requires Linux/containerized environments (Cloud AI SDK currently Linux‑only), which explains the Intel + Ubuntu choice and limits immediate Windows/ARM adoption. The dedicated accelerator memory and VLIW architecture mean developers will need to adapt toolchains and runtime integration, but gain a compact, lower‑latency option for on‑device model serving and experimentation.
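The "up to 120B parameters" claim against 64 GB of accelerator memory implies aggressive quantization: FP16 weights alone for a 120B model far exceed 64 GB, while 4-bit weights just fit. A quick back-of-envelope check using only the numbers from the summary above (weights-only footprint; `weights_gib` is an illustrative helper, not part of any Qualcomm SDK, and activations/KV cache would add further overhead):

```python
# Back-of-envelope: which precisions let a 120B-parameter model's
# weights fit in 64 GB of dedicated accelerator memory?
# Weights-only lower bound; runtime overhead is ignored.
GIB = 1024**3

def weights_gib(params: float, bytes_per_param: float) -> float:
    """Raw weight footprint in GiB for a given parameter count and precision."""
    return params * bytes_per_param / GIB

params = 120e9  # "up to 120B parameters" per the spec above
for name, bpp in [("FP16", 2.0), ("INT8", 1.0), ("INT4", 0.5)]:
    size = weights_gib(params, bpp)
    print(f"{name}: {size:6.1f} GiB  fits in 64 GB: {size <= 64}")
```

Only the INT4 row fits, which is consistent with the AIC100 being positioned as an inference part: serving a 120B model on this hardware would realistically mean a quantized checkpoint, not full FP16 weights.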