The FINANCIAL — Every moment of our waking lives, images fall onto our eyes and go through a series of processing steps in the brain to inform us about what is going on in the world. A team of researchers led by the University of Amsterdam’s Swammerdam Institute for Life Sciences finds that the time the brain needs to make a visual decision depends not only on the properties of the images being processed, but also on whether relevant sounds or touches are present. Their work is now published in Nature Communications.
After images from the outside world reach our eyes, signals cascade through the visual system. Neurons across different brain areas recombine these signals to form a representation that is then used to understand the world and react to it. The primary visual cortex is the first stage in the cerebral cortex where visual information is processed. Previously, our understanding has been that this process is mainly determined by the complexity of the visual scene to be processed. However, we also know that vision is just one of the senses and does not operate in isolation: there are also sounds, touch, smell and other senses. The researchers therefore asked whether the processing time required to make a perceptual decision was affected if other senses also needed to be monitored.
What we see is also what we hear and touch
‘We compared two sets of subjects,’ says Matthijs oude Lohuis, first author of the paper together with Jean Pie. ‘One cohort of mice was trained to report what they saw and ignore what they heard or felt on their whiskers. The other cohort was trained to report what they saw, but also what they heard or felt.’ The researchers found that the activity of neurons in the visual cortex varied depending on whether subjects were trained to report only visual stimuli or other modalities as well. Brain signals indicating that visual stimuli had been detected took longer to appear if mice were also paying attention to sounds or touch. ‘We hypothesized that a longer duration of processing in the primary visual cortex would also indicate a longer involvement of this region in the transformation of sensory stimuli into appropriate motor responses,’ says Umberto Olcese, a senior researcher on the team. To test this hypothesis, the researchers used a molecular tool called optogenetics, which enabled them to turn the visual cortex on or off by illuminating it with a highly specific laser beam.
As expected, activity in the visual cortex was necessary for detecting visual stimuli. Surprisingly, however, the time period during which the visual cortex was necessary for detection depended on whether mice were also paying attention to other sensory modalities: a task requiring the mice to monitor another sensory modality prolonged the time during which activity in the visual cortex was needed to detect a visual stimulus, independently of the features of the images being processed.
Towards an integrated view of processing in the brain
This study challenges our current understanding of how vision works. While we are used to thinking of visual processing as occurring in isolation, the way the brain analyzes images hitting our eyes is influenced, already at the early stages, by other factors, including other sensory modalities and how we use sensory information. This multimodal view of sensory processing also underlies a theory that one of the team members, Prof. Cyriel Pennartz, previously proposed on the brain mechanisms of perception and consciousness. This theory states that conscious processing is jointly shaped by multiple senses.
This work is a major step towards developing an integrated view of how different cortical regions jointly process the diverse, multimodal information that constantly reaches our senses.