A brain-network renders subjective experience from multiple predictive modules (www.nature.com)

🤖 AI Summary
Researchers used fMRI while participants watched a short movie (n=111), combined with separate behavioral crowdsourcing to generate continuous "belief-update" timecourses, to show that the midline prefrontal cortex implements three parallel, domain-specific predictive models. Ventromedial PFC tracked contextual "State" predictions, anteromedial PFC encoded social reference-frame "Agent" predictions, and dorsomedial PFC handled temporally abstract "Action" transitions. Moments of prediction-error-driven neural change in each region aligned with domain-specific subjective belief updates. The team combined BOLD regression against update probabilities, thresholded event binarization, Hidden Markov modeling of neural-state transitions, and inter-subject correlation analyses; key results replicated in an independent spoken-story dataset (n=52), indicating modality-general effects. Crucially, the separate PFC predictions are funneled into and selectively integrated with sensory streams in the precuneus, suggesting a two-stage architecture: specialized, modular model inference in PFC and global integration in a default mode network (DMN) hub that produces coherent subjective experience. The work supports modular cognitive maps (States, Agents, Actions) over a single unified internal model and links prediction-error dynamics to conscious belief changes. For AI/ML, the findings bolster modular architectures with domain-appropriate inductive biases plus a central integrator, a design that could improve generalization, compositionality, and flexible planning in artificial agents.
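Of the analyses listed above, inter-subject correlation (ISC) is the most self-contained to illustrate. Below is a minimal leave-one-out ISC sketch on synthetic data, not the paper's actual pipeline: the array shapes, noise level, and function name are illustrative assumptions.

```python
import numpy as np

def leave_one_out_isc(data):
    """Leave-one-out inter-subject correlation.

    data: (n_subjects, n_timepoints) array holding one region's BOLD
    timecourse per subject. Returns each subject's Pearson r with the
    mean timecourse of all remaining subjects.
    """
    n_subjects = data.shape[0]
    iscs = np.empty(n_subjects)
    for s in range(n_subjects):
        # Average everyone except subject s, then correlate.
        others_mean = np.delete(data, s, axis=0).mean(axis=0)
        iscs[s] = np.corrcoef(data[s], others_mean)[0, 1]
    return iscs

# Synthetic demo: a shared stimulus-driven signal plus subject noise.
rng = np.random.default_rng(0)
shared = rng.standard_normal(200)           # common "movie-driven" signal
noise = rng.standard_normal((10, 200))      # idiosyncratic subject noise
data = shared + 0.8 * noise

print(leave_one_out_isc(data).mean())       # high mean ISC: signal is shared
```

Regions whose activity is driven by the shared stimulus (rather than idiosyncratic processing) show high ISC, which is why the analysis is a standard tool for naturalistic movie and spoken-story datasets like those used here.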