Alisa Mandrigin is a post-doctoral researcher on the AHRC Science in Culture Theme Large Grant, Rethinking the Senses: Uniting the Neuroscience and Philosophy of Perception. She is based in the Department of Philosophy at the University of Warwick. This is a blog post she wrote for iCog.

We see, hear, touch, taste and smell. This is what common sense tells us about perception. The view that the nature and breadth of our perceptual experience is accurately captured by those five verbs and that the sensory systems are discrete and isolated from one another has governed much of the philosophical research into perception. Recently, though, researchers in psychology and neuroscience have started to focus on the myriad interactions between the senses and the many other kinds of sensory information available to the nervous system.
How do we know that the sensory systems interact with one another? Some interactions result in effects that feature in everyday experience. For example, if we are presented with a light flash and a beep at different locations but at the same time and then asked to locate the beep, we judge it to be at, or at least close to, the location of the light flash (Bertelson, 1999). Judgements about location are one of the measures of the Ventriloquism effect: the mis-location in perceptual experience of auditory objects or events as a result of seeing something at a different location.
We can measure the effect in the laboratory with visual and auditory stimuli, but in everyday life we are often in situations in which we acquire spatially discrepant information about what is apparently the same source. Ventriloquists make use of the effect in their acts, hence the name given to the effect. At the cinema the film’s soundtrack is played through speakers spread out around the screening room, and not from behind the parts of the screen on which visual images of moving lips, explosions, and so on are presented.
Another visual-auditory interaction gives rise to the McGurk effect. If we see a video clip of lip movements that should produce the phoneme /ga/ with an audio recording of the phoneme /ba/ dubbed over the top, the result is perception of the phoneme /da/ (McGurk & MacDonald, 1976). Again, it seems that processing in the visual system influences processing in the auditory system. We can appreciate this by listening to the same auditory stimulus with our eyes closed: without the visual stimulus there is no effect. You can try it for yourself by viewing this video.

Multisensory interactions are not limited to vision and audition. We have evidence of interactions between all five of the sensory systems, as traditionally conceived. It’s only now, though, that philosophers are beginning to take proper notice of the implications of these discoveries for perceptual experience. This brings us to a question that seems to be important if we are to make sense of these interactions and their consequences for perceptual experience: how do we distinguish the senses? Can we distinguish the senses on the basis of the different kinds of experience produced, or should we distinguish them on the basis of distinct sensory processing systems in the brain, or by means of the nature of the proximal stimulus of the experience, or by something else entirely (Macpherson, 2011)? There’s an analogous question about kinds of experience: how can we distinguish experiences from one another as being, for example, visual or auditory?
Settling on an answer to these questions seems to be necessary if we are to make any headway in classifying interactions as multisensory and deciding whether these interactions result in multimodal perceptual experiences.

For example, our perceptual experience when we eat and drink involves retro-nasal smell—the sensing of odours when we breathe out—as well as taste. You have probably noticed this when you've had a blocked nose, finding food temporarily flavourless and insipid. One response has been to claim that we have a distinct kind of flavour experience, resulting from interactions between the olfactory and taste systems (Smith, 2013). On this view the experience is multisensory in so far as it involves processing in two distinct sensory systems, but the experience itself is not taken to be multimodal, since it is thought of as a kind of experience in its own right, distinct from either smelling or tasting. The matter is complicated further by evidence that what we see and hear, as well as tactile sensations within the mouth, also contribute to our perceptual experience when we eat (Auvray & Spence, 2008).
Even if we can settle on a way of distinguishing the senses, there are further questions about the kinds of interactions that take place between the sensory systems. One kind of interaction might involve mere modulation of processing in one sensory system by processing in the other. Another kind of interaction might involve the integration of redundant information across the senses. A further kind of interaction might involve the binding together of information about different properties of the same object. For example, when you look at a key that you hold in your hand, visual information about colour might be bound together with tactile information about texture, generating a multisensory representation of the key as smooth and silver (O’Callaghan, 2014). These different kinds of interaction may have different kinds of impact on perceptual experience. What, for example, is the nature of the interaction between vision and audition in ventriloquism and how does it impact perceptual experience?
One approach to ventriloquism explains the effect in terms of the modulation of information in audition by information in vision. Ventriloquism is often measured by subjects’ pointing responses to the auditory stimulus. Subjects point to a position in between the actual locations of the auditory and the visual stimuli. How can we explain this in terms of modulation? We can say that the conflicting visual information about location modifies the auditory information about location (and vice versa). The result is that subjects hear the auditory stimulus as being in between the actual position of the auditory and the visual stimuli. This explanation of ventriloquism is consistent with perceptual experiences remaining modality-specific throughout.
There is, however, an alternative explanation of the mis-localisation of auditory stimuli in ventriloquism. This alternative explains subjects' pointing behaviour in terms of the integration of conflicting spatial information. If sensory information is integrated, it seems possible that this integration will result in a single multimodal experience of an object at a location in space, in this case an audio-visual experience. If there is integration (or binding) of information across the senses, then we need to give some account of how the sensory systems determine that information belongs together.
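The integration account is often modelled in the empirical literature as reliability-weighted averaging: each sense delivers a noisy estimate of location, and the fused estimate weights each cue by its precision. A minimal sketch of that standard model, assuming Gaussian noise on each unisensory estimate (the function name and the numerical values here are illustrative, not taken from the studies discussed above):

```python
# Reliability-weighted (maximum-likelihood) integration of two
# conflicting spatial cues, as in standard models of ventriloquism.
# Each cue is a Gaussian estimate: a mean location and a variance.

def integrate(loc_a, var_a, loc_v, var_v):
    """Combine auditory and visual location estimates.

    Each estimate is weighted by its reliability (inverse variance),
    so the more precise cue dominates the fused percept.
    """
    w_a = (1 / var_a) / (1 / var_a + 1 / var_v)
    w_v = 1 - w_a
    fused_loc = w_a * loc_a + w_v * loc_v
    fused_var = 1 / (1 / var_a + 1 / var_v)  # fused estimate is more precise
    return fused_loc, fused_var

# Vision localises more precisely than audition (smaller variance),
# so the fused location lies close to the visual stimulus -- the
# "ventriloquised" percept that subjects point to.
loc, var = integrate(loc_a=10.0, var_a=4.0, loc_v=0.0, var_v=1.0)
print(loc)  # 2.0: between the two sources, but much nearer the light
```

On this model the in-between pointing responses fall out naturally: the fused location sits between the auditory and visual stimuli, biased towards whichever cue is more reliable, which for spatial location is usually vision.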
The issues I’ve mentioned here offer just one avenue that we can pursue in rethinking and revising our views of perceptual experience in light of empirical discoveries about multisensory processing. Another avenue concerns crossmodal correspondences. We reliably match, for example, high pitch sounds with bright lights, high spatial elevations or small objects (Spence, 2011). How, though, are these associations between what seem to be different kinds of properties established? Are pairs or groups of apparently unrelated features of objects, or dimensions of stimuli, encoded in the brain in the same way?
A further line of research concerns synaesthesia. In some cases of synaesthesia an experience in one sensory modality seems to induce an experience in another, non-stimulated sensory modality. For instance, for some synaesthetes, hearing sounds causes them to have colour experiences. Franz Liszt and Olivier Messiaen reportedly experienced colours when they heard particular tones in this way. As with crossmodal correspondences, synaesthetic experience is reliable and robust: hearing particular tones consistently induces experiences of particular colour hues. How do we explain the phenomenon? Do synaesthetes have two distinct modality-specific experiences—an auditory experience and a colour experience, for example—or are their experiences altogether different, experiences of coloured sounds, for instance (Deroy, in press)?
We are just now starting to understand the many and varied interactions that occur across the sensory systems and their impact on perceptual experience. What is clear, though, is that the multisensory nature of perceptual experience matters not just to those who work on the philosophy of perception or in psychology, or to those who work in the arts or in marketing, but to all of us, simply because we are perceivers.

AHRC Science in Culture Blog Post
This is a guest blog post by Dr Keith Wilson, Post-doctoral researcher, ‘Rethinking the Senses: Uniting the Philosophy and Neuroscience of Perception’

The Glasgow Science Festival showcases some of the outstanding contributions made by Glasgow and Glasgow-based researchers to science, technology, engineering, maths and medicine. This year, my colleagues and I on the Rethinking the Senses and Value of Suffering projects, both of which are hosted at the University of Glasgow’s Centre for the Study of Perceptual Experience (CSPE), took the opportunity to exhibit some of the latest interdisciplinary research into the science and philosophy of perception to a public audience at GSF 2015. The result was a lively and varied mixture of activities, talks, and even a film screening, that invited visitors to explore the strange and often surprising world of multisensory perception and illusions.

Whilst it might seem unusual for arts and humanities researchers to exhibit at a science festival, as one of the AHRC Science in Culture theme’s large-grant projects, Rethinking the Senses investigates the nature of perceptual experience from the perspectives of science, philosophy and the arts. In particular, we examine how the current trend towards a ‘multisensory’ approach to sensory perception can help us better understand the way in which we experience and interact with the world. This goes beyond the traditional ‘five senses’ of vision, hearing, touch, taste and smell—a division that was familiar to Aristotle—to consider lesser-known senses (e.g. proprioception, balance, pain, thermo- and mechanoreception), as well as interactions between and across the senses.

Interactive Activities
To illustrate this, and building upon the success of our earlier ‘Hidden Senses’ event, our team devised a series of fun and interactive activities that challenged the preconceptions of hundreds of visitors to Glasgow’s Kelvingrove Art Gallery and Museum about what they could taste, hear, smell, see and feel, as part of a weekend-long event featuring researchers from across the sciences. Amongst other things, we tested

• visitors’ abilities to identify the ‘taste’ of sweets without using their sense of smell (very difficult since both taste and smell are required to judge flavour)

• whether or not they were ‘supertasters’ (as recently demonstrated on the BBC’s Masterchef programme by AHRC leadership fellow Barry Smith)

• whether they could hear the difference between hot and cold liquids being poured into a cup (you can!)
along with a series of visual, auditory and tactile illusions, including the waterfall illusion, the McGurk effect, and the rubber hand illusion, in which sensations of touch seem to be located in a prosthetic dummy hand rather than in one’s own hand, which remains concealed behind a partition. Visitors were invited to record their experiences on an Activity Passport, the results of which will be made available via the CSPE website.

Multimodal Illusions
Many of the above cases involve information from one sensory modality, e.g. vision or smell, influencing another, e.g. touch or taste, creating a truly multimodal experience. In the McGurk illusion, for example, a heard phoneme /ba/ fuses with a seen lip movement for /ga/ to produce the experience of a third, illusory phoneme /da/. Whether this illusory sound is heard, seen, or some combination of the two is an interesting and difficult question to answer, since the resulting experience is partly generated by vision, and so changes when you close your eyes.

Increasingly, scientists and philosophers are realising that such examples are not just isolated curiosities, but illustrate important aspects of how our perceptual and sensory systems combine information from multiple sources into a single conscious experience in which it is no longer apparent which sense, or combination of senses, is operative. Indeed, whether we should think of hearing, vision or taste as discrete senses at all is one of the philosophical questions we hope to shed some light upon during the course of the project.

In addition to the event at Kelvingrove, some of our lead investigators gave talks on the science and philosophy behind our research. These linked in with the Festival’s themes of food, drink, and light, the latter commemorating the 150th anniversary of the groundbreaking discoveries of Scottish physicist James Clerk Maxwell. The talks were extremely well attended, with over 150 people signing up for ‘The Perfect Meal’ (Charles Spence) and ‘The Value of Suffering’ (Michael Brady), and over 270 for ‘Vision, Perception and Illusion’ (Fiona Macpherson and Colin Blakemore).

In his talk ‘Vision Impossible!’, RTS principal investigator Colin Blakemore discussed the magnitude of the problem faced by the brain in turning light falling upon the retinas into the rich and coherent visual experience we normally enjoy, while Fiona Macpherson focused upon the philosophical nature of hallucinations and illusions, and in particular the waterfall illusion in which some subjects report the effect of illusory motion without any change in position—a seeming impossibility. Indeed, this being the Science Festival, Professor Macpherson asked members of the audience to record their experiences of the illusion via a questionnaire designed to test the accuracy of this description.

All in all, GSF 2015 offered a wonderful opportunity to present and communicate our research to a wider audience, and we were delighted with the warm and enthusiastic response received. As philosophers and scientists, we spend many hours debating the finer points of multisensory integration and the nature of experience. However, it’s important not to lose touch with the sense of fascination we all feel when encountering a visual illusion for the first time, or of the surprise and wonder on a child’s face upon realising that what we call ‘taste’ is in fact largely derived from the sense of smell.

To that extent, perceptual experience is not only a problem to be studied by scientists and philosophers, but a shared feature of human experience by which we navigate and discover the world. Events like this give us an important opportunity to share what we’ve learned, stimulate people’s imaginations, and in turn build public understanding of, and support for, the arts and humanities as part of the wider search for knowledge that enriches us all.

Keith Wilson
Postdoctoral Researcher, Rethinking the Senses project
Centre for the Study of Perceptual Experience
University of Glasgow

“What colour is that dress?” The question has been fascinating people all around the world, including Prof Barry C Smith of the University of London’s Institute of Philosophy.

We all assume that we see what’s before our eyes, and if we have normal colour vision we should be able to tell what colour the dress is. And yet we’ve just discovered that the world of observers divides into two groups – those who see the dress as white and gold, and those who see it as blue and black, or blue and olive green. They know they are looking at the same image and that it isn’t changing, so why is there such marked disagreement about how the dress looks?
