Bayesian inference is a statistical method that combines prior knowledge with new evidence to make informed guesses. For example, if you know what a dog looks like and you see a furry animal with four legs, you might use your prior knowledge to guess that it is a dog.
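The dog example can be written down directly with Bayes' theorem. The sketch below uses made-up illustrative probabilities (they are assumptions for the example, not figures from the study):

```python
# Bayes' theorem for two competing hypotheses: "dog" vs. "not a dog".
# All probability values below are invented for illustration.

def posterior(prior, likelihood, prior_other, likelihood_other):
    """P(dog | evidence) = P(evidence | dog) P(dog) / P(evidence)."""
    evidence = likelihood * prior + likelihood_other * prior_other
    return likelihood * prior / evidence

# Prior belief: how often the animals you encounter are dogs (assumed).
p_dog, p_not_dog = 0.3, 0.7

# Likelihoods: chance of "furry with four legs" under each hypothesis (assumed).
p_obs_given_dog, p_obs_given_not_dog = 0.9, 0.2

p = posterior(p_dog, p_obs_given_dog, p_not_dog, p_obs_given_not_dog)
print(round(p, 3))  # prints 0.659: the 0.3 prior rises after the evidence
```

Seeing the furry, four-legged animal lifts the belief that it is a dog from 30% to roughly 66%, which is the kind of prior-weighted updating the study attributes to the visual system.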
This inherent capability enables people to interpret their environment with extraordinary precision and speed, unlike machines, which can be defeated by simple CAPTCHA security checks that ask them to identify fire hydrants in a panel of images.
The study’s senior investigator Dr. Reuben Rideaux, from the University of Sydney’s School of Psychology, said, “Despite the conceptual appeal and explanatory power of the Bayesian approach, how the brain calculates probabilities is largely mysterious.”
“Our new study sheds light on this mystery. We discovered that the basic structure and connections within our brain’s visual system are set up in a way that allows it to perform Bayesian inference on the sensory data it receives.
“What makes this finding significant is the confirmation that our brains have an inherent design that allows this advanced form of processing, enabling us to interpret our surroundings more effectively.”
The study’s findings not only confirm existing theories about the brain’s use of Bayesian-like inference but open doors to new research and innovation, where the brain’s natural ability for Bayesian inference can be harnessed for practical applications that benefit society.
“Our research, while primarily focused on visual perception, holds broader implications across the spectrum of neuroscience and psychology,” Dr. Rideaux said.
“By understanding the fundamental mechanisms that the brain uses to process and interpret sensory data, we can pave the way for advancements in fields ranging from artificial intelligence, where mimicking such brain functions can revolutionize machine learning, to clinical neurology, potentially offering new strategies for therapeutic interventions in the future.”
The research team, led by Dr. William Harrison, made the discovery by recording brain activity from volunteers while they passively viewed displays engineered to elicit specific neural signals related to visual processing. The team then devised mathematical models to compare a range of competing hypotheses about how the human brain processes visual information.
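Comparing competing hypotheses against recorded data typically means scoring each candidate model by how well it predicts the observations. The sketch below is a generic illustration of that idea using two invented Gaussian models and synthetic data; it is not the study's actual neural models or analysis:

```python
# Generic model comparison by log-likelihood: score two hypothetical models
# of the same data and keep the one that predicts the observations best.
# The data and both models are invented for illustration.
import math

# Synthetic "observations" with a mean of 0.5.
data = [0.1, 0.3, 0.4, 0.5, 0.5, 0.6, 0.7, 0.9]

def gauss_loglik(xs, mu, sigma):
    """Total log-likelihood of xs under a Normal(mu, sigma) model."""
    return sum(-0.5 * math.log(2 * math.pi * sigma**2)
               - (x - mu) ** 2 / (2 * sigma**2) for x in xs)

# Two competing hypotheses about the data-generating process.
scores = {
    "zero-centred": gauss_loglik(data, 0.0, 1.0),
    "shifted":      gauss_loglik(data, 0.5, 1.0),
}
best = max(scores, key=scores.get)
print(best)  # prints "shifted": the model matching the data's mean wins
```

Real neuroimaging analyses add many layers (noise models, cross-validation, penalties for model complexity), but the core move of pitting hypotheses against each other on the same recorded data is the same.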
William J. Harrison et al., Neural tuning instantiates prior expectations in the human visual system, Nature Communications (2023). DOI: 10.1038/s41467-023-41027-w
Neural model shows evolution wired human brains to act like supercomputers (2023, September 14)