🤖 AI Summary
Google Research, with collaborators at HHMI Janelia and Harvard, released ZAPBench: a first-of-its-kind, multimodal benchmark dataset that records whole-brain activity and will be paired with nanoscale structural mapping for a single larval zebrafish. Using a light-sheet microscope and exploiting the transparency of newly hatched zebrafish, the team captured nearly two hours of neural activity across more than 70,000 neurons as the live fish responded to visual stimuli. The dataset makes it possible to train and rigorously evaluate models that predict time-varying brain activity—something that has been hard to standardize because whole-brain functional recordings are rare.
ZAPBench bridges connectomics (static, ultrastructural maps of neurons and synapses) with functional imaging, enabling researchers to test how well AI models link structure to dynamics. Early results show models vary in accuracy across brain regions, revealing systematic failure modes that can guide model and experiment design. By providing open, high-resolution ground truth for both activity and (eventually) the exact specimen’s connectome, this resource could accelerate mechanistic models of brain function with downstream implications for neuroscience, brain–computer interfaces, and medical applications—similar to how benchmarks like WeatherBench unified progress in other ML domains.
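The forecasting task such a benchmark poses can be sketched in miniature: given a window of past per-neuron activity, predict the next several timesteps and score the prediction. The snippet below is an illustrative toy, not ZAPBench's actual data format or API; the array shapes, the synthetic traces, and the persistence baseline are all assumptions for demonstration.

```python
import numpy as np

rng = np.random.default_rng(0)
T, N = 200, 64  # timesteps x neurons (toy scale; ZAPBench records >70,000 neurons)
# synthetic "activity traces": cumulative noise gives slowly drifting signals
activity = rng.standard_normal((T, N)).cumsum(axis=0)

context, horizon = 4, 8
past = activity[:context]                       # observed window
future_true = activity[context:context + horizon]

# persistence baseline: repeat the last observed frame across the horizon
future_pred = np.repeat(past[-1:], horizon, axis=0)

# mean absolute error over all neurons and forecast steps
mae = float(np.abs(future_pred - future_true).mean())
print(f"persistence-baseline MAE: {mae:.3f}")
```

Simple baselines like this give a floor that learned models must beat, and computing the error per brain region (rather than one global mean) is what surfaces the region-dependent failure modes the summary describes.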