Cognitive Sciences Stack Exchange is a question and answer site for practitioners, researchers, and students in cognitive science, psychology, neuroscience, and psychiatry.

I'm wondering why there are different subjective experiences when our sensory systems are technically the same neurons, just excited by different stimuli (photons, sound waves, ...).

So why does hearing something feel different from seeing something when both are just electrical impulses?

I'm of course aware of the concept of qualia ("redness" etc.), but I haven't found this question discussed in relation to it.

Edit:

Of course I'm not claiming that the outermost parts of the sensory systems work in the same way. However, all of them produce (or their function results in) electrical impulses at various frequencies. I highly doubt that if you fed the signals produced by a retina into the neurons of an auditory system, this would result in vision.

From what I understand, all sensory systems output electrical signals. Apart from their differing frequencies, I would say the signals are the same. At the same time, I doubt that it's just the frequency spectrum of an electrical signal that makes it "vision" or "sound" in our subjective experience.

"technically the same neurons that just get excited by different stuff" - I disagree with this... Sounds will excite different neurons, and be processed differently, than sights will. In other words, yes, they're both 'just' electrical impulses, but they're completely different impulses that originate in different locations. They may activate overlapping areas over the course of processing, but they are quantitatively different.
– BenCole, May 28 '13 at 13:54

With your edit, I'm now a little confused as to your question. You admit that different sensory systems will output different signals, but your question is predicated on the idea that these signals will be processed by "the same neurons" ("when our sensor systems are technically the same neurons"). Given "I'm not claiming that the most outer part of the sensor systems work in the same way" - are you assuming that the inner sensory processing systems all work the same way as each other?
– BenCole, May 29 '13 at 13:42

With "the same neurons" I meant the neocortex etc., not the part that actually picks up the signal. That part must of course be different for every sense.
– Fabian Fritz, May 29 '13 at 16:38

That's the thing, though - there are different parts of the neocortex that process input signals from different sensory areas. After that, we get into the Binding Problem: 'How, after processing the sensory signals, are those signals aggregated (or bound) together into a coherent experience?' - Note that this doesn't assume that we're aggregating the actual signals into a single representative signal. It may be the case that the act of processing the signals is what creates the phenomenon itself - we just don't know.
– BenCole, May 29 '13 at 16:50

1 Answer

I don't know the answer (I think no one does), but you should have a look at this paper: O'Regan JK, Noë A. (2001). "A sensorimotor account of vision and visual consciousness." Behavioral and Brain Sciences. (PubMed)

In short, the proposed answer is that the different modalities are subject to different sensorimotor contingencies. For example, when you move forward, the visual input undergoes very structured changes, while the auditory input undergoes a different set of changes. So what makes seeing feel different from hearing is that the relationship between motor output and sensory input differs between the two.
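As a toy illustration of this idea (my own sketch, not taken from the paper), consider a single action - stepping toward an object - and two simple physical laws for how the input streams change: an object's angular size grows non-linearly as you approach, while sound intensity follows an inverse-square law. The lawful *relation* between the same movement and each input stream differs, which is what "different sensorimotor contingencies" means here:

```python
import math

def visual_angle(size, distance):
    # Angular size (radians) of an object of a given size at a given
    # distance; it grows non-linearly as the observer approaches.
    return 2 * math.atan(size / (2 * distance))

def sound_intensity(power, distance):
    # Sound intensity (W/m^2) from a point source: inverse-square law.
    return power / (4 * math.pi * distance ** 2)

# An agent steps toward an object, starting 10 m away.
distances = [10 - step for step in range(8)]  # 10, 9, ..., 3 metres
for d in distances:
    v = visual_angle(1.0, d)
    a = sound_intensity(1.0, d)
    # The same action (moving forward) changes the two streams
    # according to different laws - differing contingencies.
    print(f"d={d:2d} m  angular size={v:.3f} rad  intensity={a:.5f} W/m^2")
```

Halving the distance quadruples the sound intensity but does not quadruple the visual angle; on the sensorimotor account, it is regularities like these, not the carrier signal, that distinguish the modalities.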

You might also be interested in sensory substitution experiments. The brain is, to some extent, able to acquire new 'senses' when specifically structured information is provided through an existing sense. For example, it appears to be possible to 'see' (whatever that means in this context) when brightness information is transduced to touch or to electrical stimulation of the tongue (see Wikipedia).
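The front end of such a device can be sketched very simply (this is my own illustrative code, not the actual design of any real device): downsample a brightness image to a coarse grid of stimulation intensities, one value per electrode or tactor.

```python
def to_tactile_grid(image, rows=4, cols=4, max_level=255):
    """Downsample a brightness image (values in 0.0-1.0) to a coarse
    rows x cols grid of integer stimulation levels, roughly as a
    tongue-display-style device might."""
    h, w = len(image), len(image[0])
    grid = []
    for r in range(rows):
        row = []
        for c in range(cols):
            # Average the brightness over the image patch covered by
            # this electrode, then scale to a stimulation level.
            patch = [image[y][x]
                     for y in range(r * h // rows, (r + 1) * h // rows)
                     for x in range(c * w // cols, (c + 1) * w // cols)]
            row.append(round(sum(patch) / len(patch) * max_level))
        grid.append(row)
    return grid

# A bright square in the top-left of an otherwise dark 8x8 image
# becomes a strong stimulus on one corner electrode.
image = [[1.0 if x < 4 and y < 4 else 0.0 for x in range(8)]
         for y in range(8)]
print(to_tactile_grid(image, rows=2, cols=2))
```

The interesting point from the substitution literature is not this mapping itself, which is trivial, but that with practice and active exploration the resulting tactile patterns can come to support seeing-like perception.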