Symbolic and Connectionist approaches to cognition

To understand this, you need a bit of background. Cognitive Science is mostly concerned with creating models of the mind. These models are best tested by using them to build some stab at artificial intelligence. Whether and how this should happen is the subject of much debate.

One way (and for a long time it was the only way) of approaching the problem is the symbolic approach. This is closest to what computer programmers are familiar with: treat the mind as an information processor, and it is only a matter of finding out what rules it uses to process that information. Models like this are designed from the top down: start with "thinking", divide that into some more manageable bits, divide those bits into still more manageable bits, and in theory you will eventually end up with a theory. The problem is that this hasn't worked yet. Deciding how to divide up the bits, and which bits are which, creates a blather of paradoxes and problems; see for instance the homunculus problem. Also, things get ridiculously complex when you get down to the level of neurons.
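To make the symbolic idea concrete, here is a toy sketch of "mind as rule-driven symbol manipulation": a tiny forward-chaining reasoner. The rules and facts are entirely invented for illustration; a real symbolic cognitive model would have vastly more of them, which is exactly where the trouble starts.

```python
# Toy symbolic reasoner: knowledge lives in explicit, human-readable rules,
# and "thinking" is just applying them to a set of symbols.
# All rules and facts below are made up for illustration.

rules = [
    ({"mammal", "has_hooves"}, "ungulate"),
    ({"ungulate", "long_neck"}, "giraffe"),
]

def infer(facts):
    """Forward chaining: apply rules until no new symbols can be derived."""
    facts = set(facts)
    changed = True
    while changed:
        changed = False
        for conditions, conclusion in rules:
            if conditions <= facts and conclusion not in facts:
                facts.add(conclusion)
                changed = True
    return facts

print(infer({"mammal", "has_hooves", "long_neck"}))
```

Everything here is transparent and inspectable, which is the appeal of the approach; the catch is that someone has to decide what the symbols and rules are, and that decision is where the paradoxes creep in.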

The other way is the connectionist approach: start from the bottom, with lots of simple neuron-like units, and let behavior emerge from their interactions rather than from explicit rules. The bottom line is that neither approach is going to work alone. Using a symbolic approach to explain why a given neuron fired at a given time is ridiculous. Likewise, describing an "idea" as an emergent property of a vast group of neurons, while it sounds cool, doesn't get us very far in understanding what an idea is and how it works. I see the two approaches as tunneling in from either side of a giant snowbank, like I used to do with my friends when I was a kid. Eventually, with enough work, the two tunnels will meet in the middle; of course, we have to make sure they don't pass each other. The hardest part of making this snowfort will be the interface between the top-down and the bottom-up.
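For contrast with the symbolic style, here is a toy connectionist sketch: a single artificial neuron that learns the AND function from examples using the classic perceptron learning rule. The learning rate, epoch count, and random seed are arbitrary choices for the demo. Note that nothing in the result looks like a rule; the "knowledge" ends up smeared across two weights and a bias, which is exactly why this level of description says so little about what an idea is.

```python
# Toy connectionist model: one neuron learns AND from examples.
# There are no explicit rules anywhere, only adjusted numbers.
import random

random.seed(0)                          # arbitrary seed, for reproducibility
w = [random.uniform(-1, 1), random.uniform(-1, 1)]
b = 0.0

data = [((0, 0), 0), ((0, 1), 0), ((1, 0), 0), ((1, 1), 1)]

def fire(x):
    """The neuron fires (outputs 1) if its weighted input exceeds 0."""
    return 1 if w[0] * x[0] + w[1] * x[1] + b > 0 else 0

for _ in range(20):                     # a few training passes over the data
    for x, target in data:
        error = target - fire(x)        # perceptron learning rule
        w[0] += 0.1 * error * x[0]
        w[1] += 0.1 * error * x[1]
        b += 0.1 * error

print([fire(x) for x, _ in data])       # outputs for the four AND inputs
```

After training, the neuron reproduces AND, but try to point at where "AND" is stored: it isn't anywhere in particular, which is the emergent-property situation in miniature.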

There is another approach I've been thinking about. Consider Everything. No one designed Everything (well, someone designed the engine). It happened, and continues to happen, in bits and pieces. Perhaps a functional model of the mind, and a functional AI, would come about in a more evolutionary way: building bits and pieces, using old things as scaffolding, new things as food, and everything else as whatever it happens to be.
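The evolutionary idea can also be sketched as a toy: nothing below is designed toward the answer; candidate bit strings are just copied, mutated, and selected on how well they happen to work. The target string, population size, and generation limit are arbitrary stand-ins chosen for the demo.

```python
# Toy evolutionary search: build solutions in bits and pieces, keeping
# whatever happens to work. The target is an arbitrary stand-in.
import random

random.seed(1)                          # arbitrary seed, for reproducibility
TARGET = [1, 0, 1, 1, 0, 0, 1, 0, 1, 1]

def fitness(genome):
    """Count the bits that happen to match the target."""
    return sum(g == t for g, t in zip(genome, TARGET))

def mutate(genome):
    """Copy the genome and flip one random bit."""
    child = genome[:]
    i = random.randrange(len(child))
    child[i] ^= 1
    return child

population = [[random.randint(0, 1) for _ in TARGET] for _ in range(20)]
for generation in range(200):
    population.sort(key=fitness, reverse=True)
    if fitness(population[0]) == len(TARGET):
        break                           # a perfect genome emerged
    # Keep the best half as scaffolding; refill with mutated copies.
    survivors = population[:10]
    population = survivors + [mutate(random.choice(survivors)) for _ in range(10)]

print(generation, fitness(population[0]))
```

No step in the loop understands the target; the match just accumulates, generation by generation, out of variation and selection, which is roughly how Everything (and perhaps a mind) gets built.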