🤖 AI Summary
Researchers published a study in Nature Communications Medicine introducing SeeMe, a computer-vision tool that detects imperceptible, stimulus-evoked facial movements in patients with severe brain injuries—signaling covert consciousness days or weeks earlier than standard bedside exams. In 37 coma patients (and 16 healthy controls), cameras recorded the face while prerecorded audio commands (“Open your eyes,” “Stick out your tongue,” “Show me a smile”) were played. SeeMe tracks thousands of microscopic facial landmarks (down to individual pores), builds a high-resolution vector map of motion, and compares baseline activity to post-command activity. Across sessions, SeeMe identified an average of 5.4 responses out of 10 commands versus 2.8 for blinded human raters, and it detected eye-opening ~4.1 days earlier and mouth command-following ~8.3 days earlier than clinicians. Its deep neural classifier could predict which command was given from the facial motion pattern with 65% accuracy—supporting the interpretation that the movements were intentional, not random.
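The core detection idea—comparing facial motion in a baseline window against a post-command window—can be sketched in a few lines. This is an illustrative toy, not SeeMe's actual pipeline: the landmark data, the z-score threshold, and the function names are all assumptions for the sake of the example.

```python
# Illustrative sketch (NOT SeeMe's published code): decide whether a command
# evoked facial motion by comparing per-frame landmark displacement in a
# baseline window against a post-command window.
from statistics import mean, stdev

def motion_magnitudes(landmarks_per_frame):
    """Mean Euclidean displacement of tracked points between consecutive frames."""
    mags = []
    for prev, cur in zip(landmarks_per_frame, landmarks_per_frame[1:]):
        disp = [((x2 - x1) ** 2 + (y2 - y1) ** 2) ** 0.5
                for (x1, y1), (x2, y2) in zip(prev, cur)]
        mags.append(mean(disp))
    return mags

def evoked_response(baseline, post, z_thresh=3.0):
    """Flag a response if post-command motion exceeds the baseline mean by z_thresh SDs."""
    mu, sd = mean(baseline), stdev(baseline)
    sd = max(sd, 1e-9)  # guard against a perfectly still baseline
    return (mean(post) - mu) / sd > z_thresh

# Toy data: a nearly still baseline, then a small sustained movement after the command.
baseline_frames = [[(0.0, 0.0), (1.0, 1.0)] for _ in range(20)]
post_frames = [[(0.0, 0.05 * i), (1.0, 1.0 + 0.05 * i)] for i in range(10)]

base_mag = motion_magnitudes(baseline_frames)
post_mag = motion_magnitudes(post_frames)
print(evoked_response(base_mag, post_mag))  # prints True: post window moves, baseline doesn't
```

A real system would derive the landmark trajectories from video (e.g., via dense optical flow) and would need to handle noise, arousal fluctuations, and spontaneous movement, which is why the study used blinded human raters and a trained classifier rather than a fixed threshold.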
Technically lightweight and suitable for continuous bedside monitoring, SeeMe also showed that movement magnitude and frequency correlated with better functional outcomes at discharge, suggesting prognostic value. The authors caution that it is a complement to, not a replacement for, clinical examination, because sedation, obstructing equipment, and arousal fluctuations can confound detection; larger trials and integration with EMG/EEG are planned. If validated, this low-cost, noninvasive approach could reduce misdiagnosis of covert consciousness, enable earlier intervention, and reshape neurocritical monitoring and rehabilitation decision-making.