🤖 AI Summary
Apple has quietly shifted priorities away from a cheaper, slimmer follow‑up to the $3,499 Vision Pro and is fast‑tracking AI‑enabled smart glasses, which a Bloomberg report from Mark Gurman says could arrive in 2027, with a lens‑display model targeted for 2028. The move is a tacit admission that the Vision Pro, positioned as a premium, developer‑first headset, failed to gain mass traction: it is expensive, bulky, socially isolating, and short on compelling apps. Estimates put Apple's Vision Pro development and production spend in the tens of billions (some sources suggest up to ~$33B), while sales remain well below 1 million units, implying returns far short of the investment.
For the AI/ML community this pivot matters: it signals that Apple is prioritizing lighter, phone‑replacing form factors that pair on‑device AI (calls, photos, real‑time translation, assistants) with everyday wearability, a space where Meta currently leads. Technically, the industry should expect a push toward energy‑efficient optics, miniaturized displays, low‑latency on‑device inference, and tighter phone‑ecosystem integration. Apple may later revisit premium XR headsets, but for now it is betting on the spectrum model of XR: cheaper, simpler glasses to drive adoption, with a pathway for users to upgrade to higher‑end headsets as applications and hardware mature.