Category Archives: Critical Thinking – Basics

Critical thinking is “careful, deliberate determination of whether one should accept, reject, or suspend judgment about a claim and the degree of confidence with which one accepts or rejects it.” (Moore & Parker, Critical Thinking)

The problem is that much of our thinking is biased, distorted, partial, uninformed, or prejudiced. Yet the quality of our lives depends on the quality of our thoughts. Bad thinking costs us time, money, and possibly our lives. Good thinking may be profitable and may save us time and even our lives. But good thinking is hard and takes practice.

Believable premises – This assumes we have some well-informed background beliefs about the world so we can determine whether a premise is believable.

No relevant info passed over – We need to avoid the temptation to disregard contrary evidence.

Valid reasoning – When the conclusion follows from the premises, the reasoning is valid. When the reasoning is valid and the premises are also true, we have a sound argument.

Some wrong ideas about cogent reasoning – Good reasoning is not relative to people, cultures, religions, etc. (There is no male or female, black or white logic.) When you violate deductive reasoning you contradict yourself; and when you violate inductive reasoning you deny evidence and experience. The way the world works is not relative to people, cultures, religions, etc. Still, self-interest, prejudice, or narrow-mindedness leads people to reason poorly.

Background Beliefs – Background beliefs are crucial to determining whether premises are believable and whether no relevant info has been omitted. “That is why bringing one’s background beliefs to bear often is the most important task in evaluating an argument for cogency… ignorance is not bliss. It just renders us incapable of intelligently evaluating claims, premises, arguments, and other sorts of rhetoric we all are subject to every day.”

Kinds of Background Beliefs – We have beliefs about both facts [whether the St. Louis Cardinals won the 1964 baseball World Series] and values [whether it is a good thing that people play baseball.] Beliefs can also be true or false. We need to constantly examine our background beliefs to weed out false ones. Education [as opposed to indoctrination] helps us acquire true beliefs and rids us of false ones. Beliefs also differ in how firmly they should be believed. “The trick is to believe firmly what should be believed, given the evidence, and believe less firmly, or not at all, what is less well supported by the evidence.”

Worldviews or Philosophies – Children tend to believe what they are told, thus most of us believe, even as adults, what we were told as children. [For example, an almost perfect predictor of a person’s religious beliefs is the beliefs of their parents.] These basic beliefs we might call our worldviews or philosophies. “They tend to be the most deeply ingrained and most resistant to amendment of all our background beliefs.” We work very hard to keep them [so as not to create cognitive dissonance.] It is crucial that our worldviews, if they are to consist of true background beliefs, “contain at least a few modestly well-founded beliefs about important scientific theories.”

Insufficiently Grounded Beliefs – Most people have strongly held beliefs about things about which they know almost nothing. In order to think well then, we must weed out poorly grounded [false] beliefs. It is crucial—if we are to think well—that we have well-founded [true] beliefs to support our worldview since “…worldviews are like lenses that cause us to see the world in a particular way or filters through which we process all new ideas and information. Reasoning based on a grossly inaccurate or shallow worldview tends to yield grossly inaccurate, inappropriate, or self-defeating conclusions…”

Two Vital Kinds of Background Beliefs – Beliefs about human nature, and beliefs about the reliability of information sources.

Science to the rescue –

the most accurate information comes from the well-established sciences of physics, chemistry, biology, … the scientific enterprise is an organized, ongoing, worldwide activity that builds and corrects from generation to generation…Absolutely no one, starting from scratch, could hope to obtain in one lifetime anything remotely resembling the sophisticated and accurate conclusions of any of the sciences …

Summary of critical thinking – Critical thinking is a higher order of thinking as opposed to lower order thinking. Lower order thinking is 1) unreflective, 2) relies on gut intuition, and 3) is largely self-serving. Higher order thinking is 1) reflective, 2) uses logic and reason to analyze and assess ideas, and 3) is consistently fair.

More specifically critical thinking overcomes the most common tendencies of poor thinking: egocentric and sociocentric thinking.

Sociocentric thinking refers to the extent to which persons internalize the prejudices of their society or culture. Such persons: a) uncritically accept that their culture is best; b) internalize group norms without questioning; c) blindly conform to group restrictions; d) ignore the insights of other cultures; e) fail to realize that mass media shapes the news from the point of view of their culture; f) ignore their culture’s history, etc.

In contrast to unreflective thinking, critical thinking is fair-minded and open-minded—to think critically is to reason well. It is the kind of reasoning that is the essential ingredient in solving life’s problems. I have written elsewhere in this blog about good thinking, especially in my recent column “We Fear Thought.” But I would summarize my thoughts on the topic, as I did for generations of university students, by saying—good thinking is an essential ingredient in living well.

Begging the Question – Begging the question is assuming (usually in the form of a premise) the conclusion we intend to prove. Here are some examples: “Freedom is good for society because it is conducive to the good of the community.” “Chloroform renders people unconscious because it’s soporific.” “The reason that there is a big demand for a Harvard education is because everyone wants to get into the school.” Or consider this argument:

Abortion is unjustified killing;

Unjustified killing is murder;

Thus, abortion is murder.

Abortion may be murder, but this argument doesn’t show it because it begs the question—it assumes what it’s trying to prove because in the above argument unjustified killing is just another name for murder. Or try this:

The Bible says that Yahweh is the one true god.

The Bible cannot be mistaken because it is the word of Yahweh.

Thus, Yahweh is the one true god.

Yahweh may be the one true god, but this argument doesn’t show it because it begs the question.

Coincidence – Coincidence, regarding events, refers to the appearance of a meaningful connection where there is none. Humans often see patterns in what are just random fluctuations—just listen to post-game sports analysis. And people often assume that what follows something was caused by it—that correlation equals causation. (I wore blue jeans and then it rained, thus my jeans caused the rain.) This fallacy is called “post hoc ergo propter hoc.”

The only good way to test the causal connection between, for example, taking a drug and getting better is a scientifically controlled experiment. With two similar groups, we test how quickly persons recover with the drug (the test group) and without it (the control group). Unless carefully conducted experiments show something works, there is absolutely no reason to believe it does.
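The test/control logic above can be sketched with a toy simulation; the recovery-time distributions and the drug's supposed effect below are invented purely for illustration, not real clinical data:

```python
import random

random.seed(1)

# Toy controlled trial: recovery times in days, drawn from invented
# distributions. We assume (for illustration) the drug shortens recovery.
control = [random.gauss(10, 2) for _ in range(500)]   # no drug
treated = [random.gauss(8, 2) for _ in range(500)]    # with the drug

def mean(xs):
    return sum(xs) / len(xs)

print(f"control mean: {mean(control):.1f} days")
print(f"treated mean: {mean(treated):.1f} days")
print(f"estimated effect: {mean(control) - mean(treated):.1f} days faster")
```

Because the two groups are alike except for the drug, the difference in average recovery time estimates the drug's causal effect, which is exactly what an uncontrolled before-and-after anecdote cannot do.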

In fact, our very existence is coincidental, but most of us think we result from a cosmic plan. We might even conclude that gods exist because the likelihood of human life was so improbable. But this is like saying that all lotteries are fixed. Yes, it is extremely unlikely that any particular ticket wins the lottery, but that doesn’t mean the lottery is fixed when someone wins. It’s extremely unlikely that you will be dealt four aces in poker, but that doesn’t mean you cheated when you get them.
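The poker arithmetic is easy to check directly, and the same reasoning shows why a lottery almost always has a winner even though each individual ticket is a long shot (the odds and ticket count below are made-up illustrative numbers):

```python
from math import comb

# Probability that a 5-card poker hand contains all four aces:
# one way to choose the 4 aces, times 48 choices for the fifth card,
# divided by all 5-card hands from a 52-card deck.
p_four_aces = comb(4, 4) * comb(48, 1) / comb(52, 5)
print(f"P(four aces) = {p_four_aces:.7f}")  # about 1 in 54,145

# Probability that *somebody* wins when many independent tickets are sold,
# even though each ticket is a long shot (illustrative numbers).
p_ticket = 1 / 10_000_000
tickets_sold = 20_000_000
p_someone_wins = 1 - (1 - p_ticket) ** tickets_sold
print(f"P(at least one winner) = {p_someone_wins:.3f}")
```

An event that is wildly improbable for any single ticket (or any single hand, or any single universe) can still be near-certain across enough trials, so its occurrence is no evidence of rigging or design.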

Humans are well-known to have cognitive biases. The list on Wikipedia shows nearly 100 well-known and named cognitive biases. For instance, Thomas Gilovich found that most people thought that the sequence, “OXXXOXXXOXXOOOXOOXXOO” looked non-random, when, in fact, it has several characteristics maximally probable for a “random” stream.
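A quick simulation (a sketch, not Gilovich's actual method) shows why sequences like the one above merely look non-random: runs of three or more identical outcomes are exactly what genuinely random streams typically produce.

```python
import random

random.seed(0)

def longest_run(seq):
    """Length of the longest run of identical consecutive items."""
    best = cur = 1
    for a, b in zip(seq, seq[1:]):
        cur = cur + 1 if a == b else 1
        best = max(best, cur)
    return best

# How often does a random 21-flip O/X sequence contain a run of 3 or more --
# the feature people point to when they call a sequence "non-random"?
trials = 10_000
hits = sum(longest_run(random.choices("OX", k=21)) >= 3 for _ in range(trials))
print(f"~{hits / trials:.0%} of random sequences contain a run of 3+")
```

The overwhelming majority of truly random 21-flip sequences contain such runs, so spotting one is evidence of nothing at all.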

Statistics – Something may be statistically true, but the conclusion you draw from those stats is debatable. Do cancer rates go up because of air pollution, chemicals in food, people living longer, some combination of the above, some combination of the above and something else, or something else altogether? Only scientific experiments can sort this out. Moreover, the statistics you hear are often mistaken. For example, you may have heard that people only use 10% of their brains, yet this is false. Consider the following:

That 35% of British children live in poverty vastly overstates the case since most of what they need—education, health care, and housing—is provided to all.

Even if the stats accurately report what people say they do, you can’t be sure they would actually do what they say they would do.

You need to know the source of the stats so as to avoid sample bias. If we ask members of the US Table Tennis Association how many of them enjoy table tennis, we would probably get a figure close to 100%. If we asked starving children in Africa the same question, we would probably get a figure close to 0%.

Stats are often just plain wrong and nobody bothers to check them.

Morality Fever – Moral fervor isn’t a refutation of a position.

What’s Wicked is False – Just because it’s bad to believe something doesn’t make that belief false. It may be bad to not believe in the gods, but the gods may still not exist.

What’s Beneficial is True – Just because it’s beneficial to believe something doesn’t make the belief true. It may be good to believe in the gods, but the gods may still not exist.

The Meek Shall Inherit the Earth – Just because someone is a victim of injustice doesn’t mean their opinions are correct. And just because you feel guilty about something you did doesn’t mean the victims of your actions are virtuous.

Conclusion – “If the matter at hand is something you genuinely care about, then you should seek more than ever to believe the truth about it. And rationality is merely that way of thinking that gives your beliefs the greatest chance of being true.”(156)

Empty Words – Language is often empty, vague and obscure. Still precise terminology (or jargon) is sometimes needed for clarity and precision, as in the sciences. But other times jargon disguises simple ideas under a barrage of verbiage, often to sound impressive.

One way language misleads is with weasel words—words that appear to make little or no change to the content of a statement, but actually drain all or most of the content from it. Typical weasel words are may, can, could, might, arguably, etc. Other devices for deception are hooray words—justice, life, freedom—and boo words—murder, taxes, Hitler. Politicians love to use such words because then listeners believe the politician shares their concerns. Also, the use of quotation marks—to show that what some word means is only alleged—leaves you unsure of the author’s meaning.

Deceptive Language – Language is often used to persuade and confuse people. To see this consider that words have cognitive meaning and emotive meaning. For example the terms bureaucrat, government official, and public servant may not be that different cognitively, but they elicit different emotional reactions. Con artists, advertisers, and politicians manipulate our desires and beliefs by appealing to these emotions.

Recent examples are endless. Why was the name of the US “War Department” changed to “Department of Defense”? Do you think that was accidental? If you want to get rid of the “Clean Air Act,” don’t call it the “Dirty Air Act,” call it the “Clear Skies Initiative.” If you want to get rid of health care, don’t call it the Affordable Care Act, call it Obamacare. And don’t say torture, say enhanced interrogation; don’t say insanity, say battle fatigue; don’t say we attacked first, say preemptive action; don’t say occupation forces, say coalition forces; don’t say terrorists, say freedom fighters; don’t say freedom fighters, say terrorists (this is not a typo); don’t say war, say “Operation Desert Shield” or “Operation Iraqi Freedom” or “Operation Awesome!” And in addition to political and military doublespeak, there is legal, bureaucratic, and governmental doublespeak—language that deliberately obscures. (To understand this better read George Orwell.)

Inconsistency – I can’t say “every adult in France drinks wine” and then say “every adult in France doesn’t drink wine” without contradicting myself. Both statements can’t be true. Contradictions may be easy to spot if you state them explicitly like this, but often the inconsistency is harder to spot. Suppose I make the following argument:

Everything denounced in the Bible should be illegal

Abortion is denounced in the Bible, thus

Abortion should be illegal

To be consistent, I must denounce and praise everything the Bible denounces and praises. Independent of the fact that there is no clear Biblical prohibition against abortion, if one consistently followed the Bible on moral matters one would have to condemn, often under penalty of death: working on the sabbath, eating shellfish, approaching an altar with poor eyesight, getting haircuts, touching the skin of dead pigs, planting two different crops in the same field, contacting women during menstruation, cursing, rebelling against parents, and more. Thus, to be consistent, you can’t pick and choose to suit your prejudices.

Equivocation – In logic an expression is used equivocally in an argument when it has two different meanings—it is used in one place one way and another way in another place. For example, if I say that clubs don’t hurt because I joined one and I’m fine, whereas you say you were hit by one and they do hurt, then we are equivocating on the use of the word club.

Similarly, the words Mormon or Republican or Marxist have many different meanings. For example, suppose I say that being a Mormon makes you a moral person. Suppose that you respond that Mormons killed a number of people traveling through Utah in the late 19th century. I might then say “but those weren’t real Mormons!” The problem here might be that we are equivocating on the term Mormon; we are using the term differently.

One of us might be referring to the acceptance of Mormon doctrines—Joseph Smith was led by an angel to dig up and interpret gold plates with the use of a magic hat, etc.—whereas the other might mean “not being murderers.” To defend my claim that the killers weren’t real Mormons, I would have to show that being Mormon isn’t just accepting the stories in the Book of Mormon; it also involves not murdering. But then I have changed the definition of Mormon. Now it means accepting the story of Smith and not murdering. Of course on this definition, all it means to say that being a Mormon leads you to do good things is to say that being a Mormon leads you to do good things. Needless to say, the statement has now been emptied of its significance.

In the above case the definition of Mormon has been changed, and emptied of all meaning. If you do this continually, you can never be refuted. For example, if you were a government who wanted to torture people you could simply change the definition of torture to mean something you don’t do. If government critics say “you do torture by the standards set out in the Geneva Convention” then you could say “we don’t torture,” because by torture we now mean “by our standards which are worse than those conventions.” (Unfortunately such equivocation has awful real world consequences.)

Equivocation is used to deceive people, to make them draw unjustified conclusions. We could use any word—wealthy, criminal, democratic, free, great—to describe a person or a country and mean many different things.

Prejudice Disguised As Logic – If you don’t have good reasons for your beliefs, you can give them up, or stick with them just because you like them. You could also claim that your beliefs transcend thoughts and words, that you know they are true “in your gut.” But this is just prejudice dressed up; you still haven’t given any reason to support your beliefs. There are a number of “ploys by which the prejudiced attempt to substitute sanctimony or other grand irrelevancies for evidence.” (32) The author considers each in turn.

Mystery – The fact that I find auto mechanics mysterious tells you nothing about auto mechanics and something about me. It tells you I’m ignorant of auto mechanics, not that auto mechanics is mysterious. Yet people often draw this conclusion about important matters. They believe outlandish things by saying that they are mysteries. For example, I might defend my belief that Santa Claus flies around on his sled by saying that it’s just a mystery how he does this. The idea of mystery allows you to believe whatever you want, however silly.

Furthermore, it’s dishonest to believe what is obviously false. There is nothing mysterious about saying that the Green Bay Packers both won and lost the first Super Bowl; it is simply something that can’t be true. Of course there are genuine mysteries. Some things are mysteries to me because I’m ignorant of them, and some things are mysterious even to the experts. Mystery, in large part, is a measure of the current state of our ignorance. To say something is a real mystery tells us nothing about it, only that we don’t understand it. The proper reaction to a mystery is to study the issue further or forget it, not to say that we are justified in believing whatever we want.

Faith – “Rather than trying to obscure your prejudice, boldly declare it a virtue. You have no reason to believe what you do, no evidence, no argument. Of course not. This is a matter of faith!”(36) Faith doesn’t provide evidence, it merely shows that you have none. And if you say that faith is necessary when knowledge is lacking, you are claiming that all opinions are equally valid and mere prejudice. When one declares they have faith in some proposition, you can be sure they cannot defend the proposition any other way.

Odds – Pascal argued that believing in a god is a good bet. Bet for the proposition and you either win big or lose small; bet against the proposition and you either lose big or win small. Hence it’s in your interest to bet that a god exists. There are a number of problems with this line of reasoning. First, the gods might see through your wager and not reward you for playing the odds. Second, even if this were a good bet, it says nothing about the truth of the proposition. Third, the bet doesn’t tell you which god to bet on. Should you bet on Jehovah or Allah or some other god? And, since there are an infinite number of possible gods to bet on, the chance you bet on the right one is small. In addition, reality might be set up differently than Pascal imagines. Maybe the gods reward skeptics and non-believers for thinking for themselves and punish believers for accepting superstitions.
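Pascal's decision-matrix reasoning can be made concrete with a toy expected-value calculation; the probability and payoffs below are invented for illustration and stand in for Pascal's infinite stakes:

```python
# Toy expected-value table for Pascal's wager (illustrative numbers only).
# Rows: your choice; columns: whether the god in question exists.
p_god = 0.001                                        # assumed tiny probability
payoffs = {
    "believe":    {"exists": 10**9, "not": -1},      # win big vs. small cost
    "disbelieve": {"exists": -10**9, "not": 1},      # lose big vs. small gain
}

for choice, row in payoffs.items():
    ev = p_god * row["exists"] + (1 - p_god) * row["not"]
    print(f"{choice}: expected value = {ev:,.2f}")
```

With a huge enough payoff, any nonzero probability makes belief the better bet, which is precisely why the objections above matter: the calculation says nothing about whether the proposition is true, and it changes entirely depending on which god, and which divine payoff scheme, you plug in.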

Weird Science – Maybe science is a conspiracy to destroy culture and impose its truths on us. But even if that’s true, which it isn’t, your non-scientific ideas are still probably false. Sometimes when people speculate, they like to appear scientific and discuss things like quantum mechanics to impress others. Many students over the years have told me that since quantum mechanics is weird, other weird things might be true. That may be trivially true, but most weird things we believe are false. “Anyone who thinks that her favorite weird ideas—about reincarnation, astral travel, or whatever—are intellectual bedfellows with quantum mechanics ought to read some of the latter.”(44)

But Still – You can appear reasonable by admitting evidence for a view you don’t hold and saying “but still…” For example, you might have said before the start of the American Civil War, “although the north has more people, railroads, industry, ships, and artillery than the south, still I think the south will win.” Or you could say: “I know what you’ve told me about the overwhelming evidence for evolutionary biology, but still it just doesn’t feel true.” In both cases, the evidence you concede gives you good reasons to reject your current view.

It’s Obvious – If people say something goes without saying or is obvious, then it’s probably not. The big problem with sloppy thinking—thinking that doesn’t care what the reasons and evidence are—is that it can support anything. If I believe that little green dogs rule the world, then you won’t likely change my mind if I’m committed to that belief, even though you now know that I’m crazy.

Don’t Go There – Another clue that someone has little or no evidence on their side is when they engage in moral positioning when their beliefs are challenged. “My position is very important to me, don’t go there.” People with evidence rarely get so upset.

Bad People – It doesn’t refute someone’s argument to say they’re a bad person. It doesn’t refute a position to say someone isn’t allowed to express it. And it doesn’t refute an argument to say that you’ve never heard it before—truth is often boring and fiction often exciting. (Which is why people believe so many weird things.) The best refutations show how beliefs are inconsistent with well-known truths. That’s why one shouldn’t believe in telekinesis: if it were real, many well-established principles of physics would be mistaken. Now consider the claim that truth is relative to culture. This is contradicted by the fact that cultures believe things which we know aren’t true.

In short, arguments stand and fall on the strength of the reasons and evidence offered for them. I may be a bad man, but my argument for the truth of relativity, quantum, atomic, or evolutionary theory may be irrefutable. Similarly, I may be a wonderful person, but that doesn’t mean my beliefs about physics, chemistry, biology or anything else are any good.


Yesterday’s post discussed the beginning of a book I used some years ago while teaching a college class: Crimes Against Logic: …. Here are some of the issues raised in the book with interspersed comments.

Motives – The motive fallacy refers to the notion that exposing someone’s motives for expressing an opinion tells us whether the opinion is true or false. Suppose I tell you that my church is the only true one because I want you to contribute money to it. That is my motive. Still, my church might be the one true one. Money may motivate lawyers, but that doesn’t tell you whether their clients are innocent or guilty. Only the evidence, with varying degrees of certainty, can tell you what’s true.

Yet motives are relevant when dealing with someone’s testimony. If I know that you enjoy making up fantastic claims, then I have good reason to doubt your story of having been abducted by aliens. Or if I know that your motivation is to make money—say you are an oil company—then I have reason to be suspicious of your claim that climate change isn’t happening.

Politics has become so rife with consideration of self-interested motives that the benefits or harms of policies are often secondary. If a politician says “we must worry about voter fraud. That is what these illegal aliens and poor people do. They go to polling places using fake IDs.” I certainly have reason to be skeptical of this claim, since voter fraud is virtually non-existent and voter-suppression omnipresent. Real “voter fraud” is stealing or buying elections. So motives are relevant.

Authority – “Because I say so.” If you heard this as a child, it might have been your parents’ way of saying, “stop asking questions!” But suppose you ask a factual question: “Were some people in the past born to virgins?” or “Who won the 1967 baseball World Series?” The answer “because I said so” doesn’t work here. And that’s because while parental authorities may determine your bedtime, with or without a reason, they don’t have the authority to determine whether someone was conceived without sex or who won the 1967 World Series. Of course on many topics the experts are more likely to be correct than non-experts. In physics or biology, which are objective and precise, appealing to authorities makes sense. If a non-expert says, “no way is time relative to motion, and evolution is just a theory,” we should reply, “you need to go to the National Academy of Sciences website, where you will find unanimous scientific support for both of those ideas.”

People – So the fallacy of authority occurs when you confuse people who have power or unusual influence with people who are experts. You may love your parents or some celebrities, but that doesn’t mean they are experts on the history of baseball or scientific theories. Of course authorities may threaten you if you don’t believe them, and this may motivate you to believe their claims, but that doesn’t provide evidence that something is true. “For those interested in believing the truth the unsupported opinions of the ill-informed are of no help and are not improved by being offered up at gunpoint.” (21)

In the past authority figures held more sway, making it harder for people to understand the authority fallacy. Now the idea of “the people” as authority has become popular. They are constantly cited in support of various views, but the people are typically not experts in much of anything. Consider that most Americans are scientifically illiterate, but that doesn’t mean that science is false. Public opinion may decide which policy is adopted, but it doesn’t decide which policy is better—only the facts do that. The public may decide that they want to continue to have millions of people go without health care, but whether that is morally or economically better than having everyone covered—as is typical of most western democracies—is independent of the people’s opinion. The people often choose the wrong policy, and are more likely to if they lack critical thinking skills.

Opinions – Facts do not depend on opinions. The claim that Jupiter is a gaseous planet isn’t true because many people believe or don’t believe it. It’s true independent of people’s opinions. “No fact can be made just by being believed.”(24) Of course some things are a matter of taste—carrots taste good to one person and bad to another. But don’t confuse this with thinking that everything is relative. Who won the 1967 World Series or the first Super Bowl, or whether Apollo lives on Mt. Olympus, don’t depend on what you think.

Victims – Since we fear being undemocratic, disagreeing with the general consensus “is not merely bad luck for a politician who would like to be elected; it is looked upon as some kind of moral failing.” So we are expected to consult victims of crime, for instance, as if they are experts on social policies. But they are not experts, and their personal experience may cloud their judgment about what is the best public policy. It is astonishing how persons ignorant of all sorts of things are asked their opinions, often on national television! Who cares what a plumber thinks about monetary policy or a moose hunter thinks about geopolitics or a philosopher thinks about quantum physics? This is not to say these may not be nice people, and they might know a lot about plumbing, hunting, and philosophy, but that doesn’t make them experts on other topics.

Celebrities – Actors, musicians, celebrities and the like aren’t usually experts about evolutionary biology or whether you should drink Coca-Cola. Isn’t this obvious? I suppose not, since celebrities are used to sell all sorts of things. To spot the authority fallacy, try to determine if someone is really an authority, and whether there are authorities on the subject matter. There really are experts in evolutionary biology and quantum physics, but there are not authorities on whether carrots or Coca-Cola taste good. Some subject matters are precise, like the natural sciences, and you can feel confident about what the experts tell you. Others, like the humanities or film studies, are less precise, and you can feel less confident about what those so-called experts tell you. And regarding some topics, like theology, it is doubtful that the idea of an expert even makes sense.