This Startup Wants to Put Its Brain-Computer Interface in the Apple Vision Pro (www.wired.com)

🤖 AI Summary
Startup Cognixion announced a clinical trial that integrates its noninvasive brain-computer interface (BCI) with the Apple Vision Pro to help people with paralysis and speech impairments communicate. The trial (up to 10 U.S. participants) pairs a custom EEG headband—embedded with six sensors over the visual and parietal cortex—with a Vision Pro app and an external neural computing pack worn at the hip. The system decodes visual-fixation signals (produced when a user holds their gaze on an object) to select menu items, and feeds those selections into a personalized generative-AI communication model trained on the individual's speech history, style, and writing. Cognixion previously tested its Axon‑R headset with ALS patients and reported conversation-like speeds using a similar AI-assisted pipeline.

This is significant because it pursues a scalable, lower-risk alternative to invasive implants (e.g., Neuralink or Synchron), leveraging Apple's new BCI protocol and AR platform to reach more users faster. Key technical implications: noninvasive EEG is noisier and yields weaker signals than implanted electrodes, so Cognixion relies heavily on AI to compensate and improve decoding accuracy and the user experience; a larger pivotal trial (~30 patients) and FDA clearance will be required to prove efficacy and usability. If successful, the approach could democratize assistive BCIs—but experts caution that signal quality remains the main barrier, and AI copilots will be critical to closing the performance gap with implanted systems.