Earlier today I was reading Bertrand Russell’s “The Problems of Philosophy,” and it got me thinking about why firsthand experience can be so compelling for people who see ghosts, believe they have been abducted by aliens, or draw causal connections between an alleged cure for an illness and their apparent recovery from it.

But undisciplined firsthand experience can fool us if we do not know our limits. We can convince ourselves absolutely of things that simply do not hold up under rigorous examination.

But even though it sometimes misleads, experience is normally quite reliable. The fallacy lies in thinking it more reliable than it really is, in thinking it infallible.

First, without good background knowledge of how our senses, our brains’ sensory processing, and our thinking about those perceptions can go astray, we can easily mistake the perception for the thing itself, assuming that what we see is what is, exactly as it is: an idea known as naive realism.

There’s the possibility that we could be hallucinating, dreaming, or at least witnessing a sensory illusion based on a misperception of something real.

This can seem compelling: even though it’s conceivable to doubt the reality of the computer monitor in front of me as I sit at my desk, what is indisputable to me is that the seeming of a computer monitor and the images on it, along with the visual, auditory, and tactile sensations of other objects in my room, are being perceived by someone or something.

And they appear relatively constant, not randomly moving about unless some agency picks them up or knocks them over.

Partly because of this quality of realism, personal experience, whatever is actually going on in the mind of the one experiencing it, has a seeming certainty, an absolutely compelling quality, at times a stark vividness, that simply demands assent. This is the meaning behind the phrase, “seeing is believing.”

If you don’t have reason to think otherwise, that is.

Whether the one doing the perceiving is a literal self (the “I” in “I think, therefore I am”), the mind of a dreaming god, or a bank of intelligent computers on Mars running a simulation of a hypothetical human being on Earth sitting before a computer, viewing the monitor, and typing this post into a browser window, its identity is open to the possibility, however remote, of doubt.

But to be useful as a path to knowledge, doubt should be rational, and not all that is conceivably dubious is rationally so.

I find it much simpler to explain my perceptions (those that show some evident correspondence with something external to myself after cross-checking with other senses, or after independent verification by the testimony of another) as real, for the simple reason that most of the time I’m given no adequate reason to doubt them.

I can’t prove indisputably that I’m not a brain in a jar, or that I’m not living in the Matrix, or that I’m not the only person in the universe creating everything I see with my thoughts. But I’ve no reason at all to believe any of these things, given my accumulated experience and personal knowledge base: nothing irregular in most of my perceptions that serves as an obvious red flag, no feeling of something amiss, as there often is upon waking from a dream, that says, “D00d! Wake up! This ain’t real!”

Not even the slightest reason, save the occasional spurious perception that I’m well-acquainted with and recognize for what it is rather easily.

Even when lacking the ability to absolutely prove that the world is real, and not a dream of my own, or that of a titan or sleeping god, or a Martian computer network, the notion of a real world, with things in it that exist apart from my awareness of them, and whose characteristics nonetheless imprint themselves rather consistently upon my perceptions over time (such as the continued existence of this computer, or my cats, when I turn my gaze from them or leave the room and return to find them much the same), is just so useful and elegant a hypothesis that it seems foolish to deny it merely because such denial is possible.

One of the most important lessons I’ve learned as a skeptic is the willingness to make a simple admission of human fallibility: that our capacities for perception, reason, and insight are limited. It is to recognize that what these and our memories tell us, while usually reliable, can often lead us into error and deception. It is to acknowledge:

“I can be fooled, just like anyone else.”

This is one of the key differences between skeptics and many believers: the latter are often convinced that they cannot be fooled, for whatever reason, and in so believing set themselves up to be just that.

The inability to admit one’s capacity for being fooled is a common weakness for many parapsychologists and was one of the major reasons for the infamous gullibility of Sir Arthur Conan Doyle and the credulity of many of the investigators of the paranormal from the 19th century onward.

Why can we all be fooled? Because our brains, working the way they do, not always the way they should, sometimes play tricks on us, and we may perceive, believe, or remember things that simply are not so, and often never were. Human reasoning, perception, and memory are not faithful transcribers of everything we experience.

Our perceptions and memories are constructive.

When we remember something, we are not looking at an exact recording of an event in our heads, accurate in all details; we are each time recreating the memory anew, often emphasizing or even confabulating some details at the expense of others, embellishing our recollection at the expense of the truth. Through suggestibility, this can result in the creation of entire memories of events that never happened, though our confidence in their accuracy may be absolute.

One’s conviction of the truth of one’s memories is no guarantee of their accuracy, even with so-called “flash-bulb” memories.

Our senses can deceive us through many routes, particularly our prior beliefs and expectations, causing us to see what is simply not there. You do not have to have a psychiatric diagnosis, or to be under the influence of recreational pharmaceuticals, to hallucinate. René Blondlot’s N-rays, and likely many alleged miraculous occurrences and UFO sightings “witnessed” by large numbers of people at once, are prime examples of this.

Believing is seeing, and yes, collective hallucinations by those without any mental illness do happen, and are more common than you might think. We also sometimes see things, but not as they really are, when we look at or listen to one thing but our brains tell us something else: hence optical and auditory illusions, including pareidolia.

Even our introspective ability can betray us: the conviction that one is consistently honest with oneself, and therefore immune to being fooled by that route, causes one to lower one’s proverbial guard, and in so doing, to be much more readily deceived.

Thus, even close scrutiny of one’s own thoughts, feelings and motives, while often reliable, is not infallible either.

Our ways of acquiring, storing, and recalling information have very real limits, as our evolution as a species has given us ways of knowing and interacting with the world that are merely adequate, not optimal from any competent design standpoint.

Any human engineer who would knowingly design our minds and bodies the way they are now would have been executed for failure upon first submitting his schematics.

Due to the aforementioned limits of perception and memory, and the fallibility of even our usually dependable cognitive rules of thumb, our heuristics and other hidden persuaders, personal experience, while a compelling form of evidence for many, can be very deceptive.

No one is immune to deception, especially by oneself, and no form of deception is more effective, more pervasive, and more insidious than self-deception.