Auditory and visual scene analysis

Imagine you are walking across a big, busy square. Cars are passing, pedestrians walk past and towards you, you hear people chatting and a taxi driver shouting, and you notice a beautifully coloured tree. The brain is remarkably well equipped to rapidly convert such a mixture of sensory inputs – both visual and auditory – into coherent scenes, allowing us to perceive meaningful objects and guide our navigation. This raises important questions about where and how 'scene analysis' is performed in the brain. Recent advances from both auditory and visual research suggest that the brain does not simply process incoming scene properties bottom-up. Rather, top-down processes such as attention, expectation, and prior knowledge facilitate scene perception.

This special issue covers recent advances in scene-analysis research obtained using a combination of psychophysics, computational modelling, neuroimaging, and neurophysiology, and presents new empirical and theoretical approaches. Moreover, the issue bridges the gap between sensory modalities by addressing both auditory and visual scene analysis, and includes studies of different species as well as of individual differences in humans.