
How brain rhythms organize our visual perception

A team of neuroscientists from Göttingen and Tehran shows how our brain combines visual features to achieve a unified percept
Cartoon illustration of the study’s findings. Similar to how a radio receiver identifies the radio transmitter from which a signal originates (inset at the bottom right), high-level areas of our brain distinguish the source of incoming neural activity based on its characteristic frequency. In the drawing of a human brain, A and B mark brain areas devoted to analyzing color and motion direction information, respectively, and C denotes high-level brain areas that combine the information about individual visual features into a unified percept of visual objects. In this example, the color and motion direction of the tracked glider are analyzed separately in areas A and B and then combined in area C to create our single perception of all the features of the glider. Image: German Primate Center
Prof. Stefan Treue heads the Cognitive Neuroscience Laboratory at the German Primate Center and is Professor of Biopsychology and Cognitive Neuroscience at the University of Göttingen. Photo: Ingo Bulla
Mohammad Bagher Khamechian is a scientist at the Iran University of Science and Technology in Tehran. Photo: Alireza Memarian
A rhesus monkey in the primate facility at the German Primate Center. Photo: Margrit Hampe

Imagine that you are watching a crowded hang-gliding competition, keeping track of the skillful movements of a red and orange glider. Your brain uses separate circuits to achieve such outstanding tracking ability: one specialized for processing color information, the other for processing motion direction. This division of labor allows for optimal perceptual performance, but how do we perceptually combine the color and direction information into our unified percept of the glider, or of any other object? A German-Iranian team of scientists has now discovered that the brain’s specialized color and motion circuits use different frequencies to broadcast their output to brain areas that combine the various visual feature components into a unified percept (PNAS 2019).

To investigate how information about different visual features is processed in the brain, the neuroscientists from the German Primate Center – Leibniz Institute for Primate Research in Göttingen, Germany, the Iran University of Science and Technology, and the Institute for Research in Fundamental Sciences in Tehran, Iran, measured the activity of individual nerve cells in the brains of rhesus monkeys while the animals performed a visual perception task. The monkeys were trained to report changes in moving patterns on a computer screen. Using hair-thin microelectrodes, which are painless for the animals, the researchers measured the electrical activity of groups of nerve cells. These signals continuously oscillate over a broad frequency spectrum.
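To illustrate what such a broad frequency spectrum looks like in a recorded signal, the minimal Python sketch below estimates a power spectrum. The sampling rate, the synthetic trace, and all variable names are illustrative assumptions, not the authors’ data or analysis pipeline.

```python
# Minimal sketch: estimating the power spectrum of a recorded neural trace.
# Assumptions (not from the study): a 1 kHz sampling rate and a synthetic
# signal standing in for the activity recorded by a microelectrode.
import numpy as np
from scipy.signal import welch

fs = 1000.0                          # sampling rate in Hz (assumed)
t = np.arange(0, 5.0, 1.0 / fs)      # 5 seconds of data

# Synthetic stand-in for a recorded trace: broadband noise plus an
# oscillation in the high-frequency range mentioned in the article.
rng = np.random.default_rng(0)
trace = rng.normal(scale=1.0, size=t.size) + 0.5 * np.sin(2 * np.pi * 200 * t)

# Welch's method estimates power as a function of frequency, revealing
# where in the broad spectrum the signal oscillates most strongly.
freqs, power = welch(trace, fs=fs, nperseg=1024)
mask = freqs > 100
peak = freqs[mask][np.argmax(power[mask])]
print(f"Strongest high-frequency component near {peak:.0f} Hz")
```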

The scientists recorded the activity in a brain area that is highly specialized for processing visual motion information. Using advanced signal processing techniques, they found that the activity of these nerve cells oscillates at high frequencies (around 200 cycles per second) and that these oscillations are linked to perception. “We observed that faster responses of the animals occurred whenever the nerve cells showed stronger oscillatory activity at high frequencies, suggesting that these oscillations influence perception and action,” explains Stefan Treue, head of the Cognitive Neuroscience Laboratory at the German Primate Center and one of the senior authors of the study.
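The reported link between high-frequency activity and response speed can be pictured as a trial-by-trial relationship between band power and reaction time. The sketch below shows one way to compute such a relationship; the synthetic trials, the exact frequency band, and the choice of a Pearson correlation are assumptions made for illustration, not the statistics used in the study.

```python
# Minimal sketch: relating trial-wise high-frequency power to reaction times.
# Assumptions (not from the study): synthetic trials, a 1 kHz sampling rate,
# a 150-250 Hz band, and a simple Pearson correlation as the statistic.
import numpy as np
from scipy.signal import welch
from scipy.stats import pearsonr

fs = 1000.0
rng = np.random.default_rng(1)

# 100 synthetic trials of 1-second recordings plus made-up reaction times.
trials = rng.normal(size=(100, int(fs)))
reaction_times = rng.uniform(0.3, 0.6, size=100)

def band_power(x, fs, lo, hi):
    """Average spectral power of one trace between lo and hi Hz."""
    freqs, power = welch(x, fs=fs, nperseg=256)
    band = (freqs >= lo) & (freqs <= hi)
    return power[band].mean()

# Power in the ~200 Hz range reported for the motion-processing area.
high_freq_power = np.array([band_power(x, fs, 150.0, 250.0) for x in trials])

# The observation in the text translates into a negative relationship:
# stronger high-frequency oscillations go with faster responses.
r, p = pearsonr(high_freq_power, reaction_times)
print(f"correlation r = {r:.2f}, p = {p:.3f}")
```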

Previous studies had shown that different visual aspects, such as the color and motion direction of visual objects, are analyzed in highly specialized, anatomically separate brain areas. These areas then transmit their information to high-level brain areas, where the individual features are combined to form our unified percept of visual objects. It turns out that the brain region processing color information transmits its information at a lower frequency (around 70 cycles per second) than the high-frequency transmission of the brain region processing motion signals. “Our computational analysis shows that high-level regions could use these different frequencies to distinguish the source of the neural activity representing the different features,” explains Mohammad Bagher Khamechian, scientist at the Iran University of Science and Technology in Tehran and first author of the study.
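The radio analogy from the figure caption can be made concrete with a small simulation: two sources “broadcast” at roughly the frequencies mentioned above, and a downstream reader separates them with band-pass filters. This is a toy Python illustration of the frequency-labeled-lines idea, not the authors’ model; all signal parameters are assumptions.

```python
# Toy illustration of frequency-labeled lines: two upstream "transmitters"
# oscillate at different frequencies, and a downstream "receiver" recovers
# each one with a band-pass filter, like tuning a radio. The frequencies
# follow the ranges mentioned in the text; everything else is assumed.
import numpy as np
from scipy.signal import butter, sosfiltfilt

fs = 1000.0
t = np.arange(0, 2.0, 1.0 / fs)

color_channel = np.sin(2 * np.pi * 70 * t)    # color area, ~70 Hz
motion_channel = np.sin(2 * np.pi * 200 * t)  # motion area, ~200 Hz
mixed_input = color_channel + motion_channel  # what a downstream area receives

def band_pass(x, lo, hi, fs):
    """Zero-phase Butterworth band-pass filter between lo and hi Hz."""
    sos = butter(4, [lo, hi], btype="bandpass", fs=fs, output="sos")
    return sosfiltfilt(sos, x)

# Separating the mixed input by its characteristic frequency bands.
recovered_color = band_pass(mixed_input, 55.0, 85.0, fs)
recovered_motion = band_pass(mixed_input, 170.0, 230.0, fs)

print("color match:", np.corrcoef(recovered_color, color_channel)[0, 1])
print("motion match:", np.corrcoef(recovered_motion, motion_channel)[0, 1])
```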

Detailed knowledge of how the brain of rhesus monkeys enables perception, as well as other complex cognitive functions, provides insights into the same processes in the human brain. “The oscillatory activity of neurons plays a critical role for visual perception in humans and other primates,” summarizes Stefan Treue. “Understanding how exactly these activity patterns are controlled and combined not only helps us to better understand the neural correlates of conscious perception, but may also enable us to gain a better understanding of the physiological deficits underlying disorders that involve perceptual errors, such as schizophrenia and other neurological and neuropsychiatric diseases.”


Original publication

Khamechian MB, Kozyrev V, Treue S, Esghaei M, Daliri MR (2019): Routing information flow by separate neural synchrony frequencies allows for functionally labeled lines in higher primate cortex. PNAS, https://doi.org/10.1073/pnas.1819827116