Summary: Recent studies indicate that neurons in the visual cortex are much more flexible than previously understood, showing a dynamic response to complex stimuli when recognizing objects. Traditionally, visual processing has been seen largely as a straightforward feedforward mechanism, but this research suggests that feedback from higher brain regions utilizes prior knowledge and contextual information to shape perception.
This “top-down” input enables early visual neurons to adjust their responsiveness in real-time, depending on the task. These discoveries challenge longstanding models of visual processing and could have significant implications for understanding perception, as well as conditions like autism.
Key Facts:
- Dynamic Neurons: Early visual neurons can quickly adjust based on task requirements and past experiences.
- Top-Down Feedback: Higher brain areas provide contextual data to lower visual regions, guiding perception.
- Implications for Autism: Insights into feedback mechanisms might illuminate perceptual variations in autism.
Source: Rockefeller University
From the moment we open our eyes, our brains start crafting internal representations of our surroundings. Neurons in the visual cortex help us piece together scenes into recognizable objects.
This process follows the ventral visual pathway, which runs from the primary visual cortex at the back of the brain to the temporal lobes.
Historically, it was believed that neurons in this pathway were designated to handle certain types of information based on their location, and that visual data flowed primarily in a feedforward direction, progressing through successive visual cortical areas.
Nonetheless, the existence of feedback connections has been recognized, though their functional significance remained largely unexplored.
Research from Charles D. Gilbert’s lab at Rockefeller University is uncovering the crucial role of feedback in visual processing.
In a recent paper published in PNAS, his team shows that this reverse information flow, known as “top-down” feedback, is informed by our previous interactions with objects.
This means that neurons in the visual pathway are not rigid in their responses; instead, they adapt moment by moment to the information they receive.
“Even in the early stages of object perception, the neurons are attuned to much more complex stimuli than we had thought, largely due to feedback from higher brain areas,” says Gilbert, the head of the Laboratory of Neurobiology.
A different understanding
Gilbert’s lab has investigated how information representation occurs in the brain for years, focusing on the circuitry related to visual perception and learning.
“Traditional views suggest that neurons at the beginning of the pathway only detect simple data, like a line segment, and that complexity increases further along the hierarchy, culminating in neurons that respond to more intricate features,” he explains.
Yet prior research from his lab suggests this perspective may not hold up. They found evidence that the visual cortex can modify its functionality and circuitry, a trait known as plasticity.
Working alongside Nobel laureate Torsten N. Wiesel, Gilbert also identified long-range connections in the cortical circuits that allow neurons to link information across a broader visual field than previously understood.
Additionally, he documented that neurons can shift between task-relevant and task-irrelevant inputs, highlighting their functional adaptability.
“In this study, we aimed to show that these abilities are a natural part of how we recognize objects,” he comments.
Seeing means understanding
Gilbert’s lab has dedicated time to studying macaques trained in object recognition using a range of items—some familiar, others not—including fruits, vegetables, tools, and machines.
As the animals learned to identify these objects, the researchers tracked their brain activity using fMRI to pinpoint which regions reacted to visual stimuli. This approach was developed by Gilbert’s colleague, Winrich Freiwald, who has used it to locate areas sensitive to facial recognition.
Electrode arrays were then implanted to record individual neuron activity as the monkeys viewed the objects they’d been trained on.
Sometimes the monkeys saw the entire object; at other times, only parts of it or cropped images. They were then shown various stimuli and asked whether they recognized any as matching the original object.
“These are called delayed match-to-sample tasks,” Gilbert notes. “There’s a pause between seeing an initial object and being presented with a second one, which they need to identify as a match.”
While scanning through the various visual options, the monkeys had to actively recall the initial image.
Adaptive responses
The findings revealed that an individual neuron could respond more strongly to one visual target at one moment and, given a different cue, switch to responding more strongly to another.
“We’ve realized that these neurons work as adaptive processors, changing their functions based on immediate behavioral needs,” Gilbert adds.
They also found that neurons at the start of the pathway, once thought to be limited in their responses, were actually much more versatile.
“They’re sensitive to more complex stimuli than we once believed,” he clarifies.
“There doesn’t seem to be a vast difference in the complexity processed in early versus higher cortical regions as was previously assumed.”
These insights support what Gilbert considers a new viewpoint on how the cortex processes information: that neurons don’t have fixed roles. Instead, they tune their functions dynamically based on sensory experiences.
Observation of cortical activity indicated a possible role for feedback in object recognition, with higher brain areas facilitating the dynamic abilities of lower areas.
“We found that these ‘top-down’ connections bring in information about object nature and identity, gathered through experience and context,” he explains.
“It’s like higher-level areas are instructing the lower ones on how to interpret the data, and the feedforward signal from the lower areas is their response.”
“These interactions likely continue as we recognize objects and make sense of our visual environment,” he concludes.
Potential for autism research
The results underscore the growing recognition of the importance of feedback in the visual cortex and, possibly, beyond.
“I believe top-down interactions are crucial for all brain functions, including sensory processing, motor control, and higher cognitive tasks, so understanding these interactions’ cellular and circuit basis could deepen our grasp of brain disorders,” Gilbert states.
With this in mind, his lab is starting to explore animal models of autism at both behavioral and imaging levels. Research specialist Will Snyder will examine perceptual differences between autism-model mice and typical counterparts.
Concurrently, the lab will analyze large populations of neurons in their brains while they perform natural behaviors using advanced neuroimaging tools at the Elizabeth R. Miller Brain Observatory on Rockefeller’s campus.
“Our aim is to spot any perceptual distinctions between these groups and understand how cortical circuits might explain these differences,” Gilbert notes.
About this visual neuroscience research news
Author: Katherine Fenz
Source: Rockefeller University
Contact: Katherine Fenz – Rockefeller University
Image: Credit: Neuroscience News
Original Research: Closed access.
“Expectation-dependent stimulus selectivity in the ventral visual cortical pathway” by Charles D. Gilbert et al. PNAS