Abstract

We are constantly bombarded by sensory input from multiple sources. How does the human brain process all this information? How does it decide which information to combine and which to keep segregated? If a visual and an auditory stimulus originate from the same cause, say a baseball hitting a bat, there are advantages to combining the information into a single percept.

The Bayesian framework is a natural way to approach this problem: given the sensory input, an optimal observer would infer which sources generated the stimuli.

We developed a model based on Bayesian principles that makes no assumptions about the type of correlations between sources in the world, and found it to be in excellent correspondence with data from experiments on human subjects.

Further, it is possible to place this problem in the more specific frame of causal inference, which had previously been applied only to cognitive problems. Causal inference seeks to recover the hidden causal structure from the observables, implying that we are constantly inferring the causal structure of the world around us. This can be thought of as a more constrained approach than our previous model, with fewer parameters.
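The causal-inference idea can be made concrete with a small sketch. The abstract does not specify the model's form, so the following assumes Gaussian likelihoods for the visual and auditory cues, a Gaussian prior over source positions, and illustrative noise parameters (`sd_v`, `sd_a`, `sd_p`, `prior_c` are all hypothetical values, not taken from the experiments). The observer compares two hypotheses: a single common cause generated both cues, or two independent causes did.

```python
import math

def gauss(x, mu, sd):
    # Gaussian probability density N(x; mu, sd^2)
    return math.exp(-0.5 * ((x - mu) / sd) ** 2) / (sd * math.sqrt(2 * math.pi))

def p_common(x_v, x_a, sd_v=2.0, sd_a=8.0, sd_p=15.0, prior_c=0.5,
             n=2001, lim=100.0):
    """Posterior probability that a visual cue x_v and an auditory cue x_a
    share a single cause, marginalising numerically over source positions
    s ~ N(0, sd_p^2). All parameter values here are illustrative."""
    ds = 2 * lim / (n - 1)
    grid = [-lim + i * ds for i in range(n)]
    # Hypothesis C=1: one source s generates both cues
    like_c1 = sum(gauss(x_v, s, sd_v) * gauss(x_a, s, sd_a) * gauss(s, 0, sd_p)
                  for s in grid) * ds
    # Hypothesis C=2: each cue has its own independent source
    like_c2 = (sum(gauss(x_v, s, sd_v) * gauss(s, 0, sd_p) for s in grid) * ds
               * sum(gauss(x_a, s, sd_a) * gauss(s, 0, sd_p) for s in grid) * ds)
    # Bayes' rule over the two causal structures
    return like_c1 * prior_c / (like_c1 * prior_c + like_c2 * (1 - prior_c))
```

With these assumed parameters, nearby cues (e.g. `p_common(0.0, 2.0)`) favor the common-cause hypothesis, while widely separated cues (e.g. `p_common(0.0, 40.0)`) favor independent causes, which is the qualitative signature the model predicts.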

We performed a number of psychophysics experiments and found the subjects' responses to be in excellent accordance with the model's predictions. Further predictions of the model, such as the independence of the likelihoods and priors, were also tested and found to agree with the data.

These studies show that human perception is consistent with Bayesian inference and imply a commonality between perceptual and cognitive processing.