Movies, music and pictures can train synthetic brain

A new AI-based technology developed by Cornell researchers will help gain new insights into how our brains respond to external stimuli.

Meenakshi Khosla, a doctoral student in the field of electrical and computer engineering, her adviser, Mert Sabuncu, associate professor of electrical and computer engineering in the College of Engineering, and colleagues at Weill Cornell Medicine are the authors of a new paper published in Science Advances, “Cortical response to naturalistic stimuli is largely predictable with deep neural networks.”

“Major discoveries in the field of sensory neuroscience have been driven by controlled experiments that present animals with carefully designed artificial visual and auditory stimuli,” Khosla said. “In this study, we present an alternate way to expedite neuroscientific discovery.”

This alternate method involves collecting neural responses to rich, complex stimuli that mimic natural conditions, namely movies, and training computational models with different inductive biases to predict the evoked responses. The study uses functional MRI data from the Human Connectome Project database, collected while subjects passively watched movies made up of clips from commercial films such as “Home Alone,” “Star Wars” and “Inception.”
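The general recipe behind such encoding models can be illustrated with a small sketch. The example below is not the authors’ code: it uses synthetic stimulus features and simulated voxel responses, and a simple ridge regression rather than the deep neural networks trained in the study, but it shows the core idea of fitting a model that maps stimulus features to per-voxel brain responses and scoring it on held-out data.

```python
# Minimal sketch of a neural encoding model: predict per-voxel brain
# responses from stimulus features and evaluate on held-out time points.
# Illustrative only: the features, dimensions and ridge model are
# placeholders, not the deep networks used in the actual study.
import numpy as np
from sklearn.linear_model import Ridge

rng = np.random.default_rng(0)

n_train, n_test = 800, 200   # fMRI time points (TRs) for train/test clips
n_features = 512             # stimulus features (e.g., from a pretrained net)
n_voxels = 1000              # cortical voxels to predict

# Placeholder stimulus features and simulated voxel responses.
X_train = rng.standard_normal((n_train, n_features))
X_test = rng.standard_normal((n_test, n_features))
true_weights = rng.standard_normal((n_features, n_voxels)) * 0.1
Y_train = X_train @ true_weights + rng.standard_normal((n_train, n_voxels))
Y_test = X_test @ true_weights + rng.standard_normal((n_test, n_voxels))

# Fit a linear encoding model for all voxels at once.
model = Ridge(alpha=10.0)
model.fit(X_train, Y_train)
Y_pred = model.predict(X_test)

# Score each voxel by the correlation between predicted and observed responses.
def voxelwise_correlation(y_true, y_pred):
    y_true = (y_true - y_true.mean(0)) / y_true.std(0)
    y_pred = (y_pred - y_pred.mean(0)) / y_pred.std(0)
    return (y_true * y_pred).mean(0)

r = voxelwise_correlation(Y_test, Y_pred)
print(f"median voxel correlation: {np.median(r):.3f}")
```

In the study itself, the stimulus features come from deep neural networks with different inductive biases, and the models’ predictions are compared against measured fMRI activity across the cortex.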

“We developed an AI-based technology that can reliably predict the activation patterns across the brain of a person watching a movie, listening to an audio track or looking at a picture,” said Sabuncu, also an assistant professor of electrical engineering research in radiology at Weill Cornell Medicine. “We can view this technology as a synthetic brain that can allow researchers to gain new insights into how our brains respond to external stimuli.”

Such a synthetic brain could lead to novel neuroscientific findings, as well as brain-computer interface tools for assisting or augmenting cognitive, sensory and motor functions.

Amy Kuceyeski, an adjunct associate professor of computational biology in the College of Agriculture and Life Sciences and an associate professor of mathematics in radiology at Weill Cornell Medicine, is a co-author and collaborator on the work.

“Studying how the brain processes external stimuli is an important area of neuroscience research because it allows a window into how the brain works,” Kuceyeski said. “Knowing how the brain processes images and sound may provide ideas on how to mirror these mechanisms in artificial systems, such as computer vision.”

The researchers’ proposed approach can replicate findings from a large number of prior studies in neuroscience, spanning research on multisensory integration, cortical temporal hierarchy and functional selectivity within the brain.

Taken together, the paper states, the findings underscore the potential of neural encoding models as a powerful tool for studying brain function in ecologically valid conditions.

Eric Laine is a communication specialist with the School of Electrical and Computer Engineering in the College of Engineering.


Media Contact

Gillian Smith