🤖 AI Summary
Apple has paused work on a cheaper, lighter “Vision Air” variant of the $3,499 Vision Pro and is redeploying engineers to accelerate development of AI-powered smart glasses, Bloomberg reports. The first glasses model — potentially announced as soon as next year and launched by 2027 — will include cameras, microphones, health sensors, an Apple-designed chip, and voice/AI features, but no integrated display. A display-equipped version is still planned (previously targeted for 2028) but Apple is fast-tracking that timeline after Meta’s recent Ray‑Ban Display announcement. Meanwhile, Apple will still refresh the current Vision Pro with an M5 chip later this year.
This shift signals Apple’s urgency to compete with Meta’s evolving Ray-Ban lineup and its broader “Orion” AR roadmap. For the AI/ML community, the move highlights two trends: a pivot from high-end spatial computing toward lightweight, always-on AI wearables, and greater emphasis on conversational, voice-driven intelligence (dependent on a next-gen Siri expected in spring 2026). Technically, Apple’s glasses will blend on-device hardware (custom silicon) with phone tethering and cloud/AI services, raising questions about latency, privacy, and the on-/off-device split for inference and model updates. The plan to offer multiple frame styles also shows Apple treating glasses as a mainstream consumer product rather than a niche developer kit.
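To make that on-/off-device split concrete, here is a minimal, hypothetical Swift sketch of the kind of routing decision a tethered wearable might make. The types, thresholds, and `routeRequest` function are illustrative assumptions for this summary, not Apple’s actual API or architecture.

```swift
import Foundation

// Hypothetical sketch (not Apple's published design): a simple policy that decides
// whether a request runs on the glasses' own silicon, the tethered phone, or a
// cloud AI service, based on privacy sensitivity, latency budget, and model size.

enum InferenceTarget {
    case onDevice   // custom silicon in the glasses
    case phone      // tethered iPhone
    case cloud      // remote AI service
}

struct InferenceRequest {
    let containsHealthData: Bool   // e.g. readings from the health sensors
    let latencyBudgetMs: Int       // how quickly a response is needed
    let estimatedModelSizeMB: Int  // rough footprint of the model required
}

func routeRequest(_ request: InferenceRequest) -> InferenceTarget {
    // Keep sensitive sensor data local regardless of cost.
    if request.containsHealthData {
        return request.estimatedModelSizeMB <= 50 ? .onDevice : .phone
    }
    // Tight latency budgets rule out a network round trip.
    if request.latencyBudgetMs < 100 {
        return request.estimatedModelSizeMB <= 50 ? .onDevice : .phone
    }
    // Everything else can fall back to larger cloud-hosted models.
    return .cloud
}

// Example: a voice query with a relaxed latency budget is sent to the cloud.
let query = InferenceRequest(containsHealthData: false,
                             latencyBudgetMs: 500,
                             estimatedModelSizeMB: 400)
print(routeRequest(query)) // cloud
```

Even in this toy form, the trade-off the summary points to is visible: the more work that stays on the glasses or phone, the better the latency and privacy story, but the smaller the models that can be used and the more often they must be updated on-device.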