Summary
Our brains are always predicting what we should perceive through our senses, and when our expectations don’t align with reality, they produce distinctive signals known as prediction errors. Previous studies primarily examined visual mismatches, but new experiments reveal that the auditory cortex can also generate strong signals when anticipated sounds are missing.
The investigation showed that when mismatches occur in vision and sound together, the brain’s response is significantly stronger than its response to either mismatch alone, indicating that the brain integrates sensory errors in a complex, non-linear way. Experiments on humans using virtual reality and EEG have revealed processes similar to those observed in mice, which could one day aid in diagnosing psychiatric disorders.
Key Facts
- Cross-Modal Prediction Errors: Visual and auditory mismatches elicit amplified brain responses.
- Human-Brain Parallels: Initial studies with humans utilizing VR and EEG show comparable prediction error signals.
- Clinical Potential: Unusual prediction error responses may assist in diagnosing and tracking psychiatric conditions.
Our brains are constantly in a state of predicting the sensory input we should receive based on what we’ve experienced in the past and our movements. When reality doesn’t match these expectations, specific neurons respond with a unique prediction error signal.
Research from the Keller group at FMI previously demonstrated that when mice navigate a virtual tunnel and their visual input suddenly halts, a strong prediction error signal is generated. However, it was unclear if this reaction was exclusive to visual stimuli.
To explore this, postdoc Magdalena Solyga designed an experiment where mice ran through a dark corridor, with sound intensity increasing as they traveled faster. Occasionally, the sound would stop, creating a mismatch between what the mouse expected versus what it actually experienced. Neurons in the auditory cortex reacted strongly, suggesting that prediction error signaling is not just confined to vision.
In a subsequent phase, Solyga tested both visual and auditory mismatches simultaneously. In this setup, the visual flow and sound were again linked to the mouse’s speed and would sometimes pause together. This led to a notably stronger brain response, indicating that some neurons specifically responded only to the combined mismatch. This finding implies that the brain processes various sensory errors in a sophisticated, non-linear manner.
Building on these mouse results, the team adapted their approach for human subjects using EEG and virtual reality. In early trials, participants walked in virtual environments, and when the visual scene unexpectedly froze while they continued moving, a clear brain response appeared, similar to what was found in mice. They are now beginning to test for combined mismatches in humans, too.
One long-term aim of this research is to identify reliable brain-based markers for psychiatric conditions. If individuals with psychosis display abnormal responses to mismatches, these signals might help with diagnosis and treatment monitoring, providing an objective measure to complement current self-reported symptoms.
However, applying this research in clinical settings will take time. Capturing brain signals during movement poses technical difficulties, as motion introduces noise in EEG data. To date, the team has tested 17 healthy adults and aims to include 50 participants to gather more comprehensive results. Variables like hairstyle and movement artifacts can influence signal quality, making a larger participant pool crucial for drawing broader conclusions.
This study also opens up new scientific questions. “We still don’t understand how the enhancement of the brain’s response to prediction errors occurs: does it result from direct communication between sensory areas, or is there another brain region that processes all these mismatches?” says Solyga. “There are so many exciting avenues to explore.”
About this neuroscience research news
Original Research: Open access. “Multimodal mismatch responses in mouse auditory cortex” by Magdalena Solyga et al. in eLife.