All Facts Are Not Created Equal

Paul Krugman laments economic tribalism, the tendency of people to view arguments about economic reality through a preconceived notion of how the world works. His arguments are very similar to the complaints Steven Taylor and I have voiced over the years about the “team sports” mentality in the discussion of politics.

Krugman leads us to some research which explains why this tribalism is so pervasive. Partly, he attributes this to what Richard Dawkins terms “argument from personal incredulity,” a refusal to accept facts that don’t jibe with their personal experience. For example, “Our intuitions about how business-y stuff works come from businesses or households selling their goods or labor to an external market. In such situations spending less is a sure-fire way to reduce debt, cutting your price or your wage demand is a sure-fire way to sell more. But in the economy as a whole, your spending is my income and vice versa; my wage matters only in comparison to your wage; and so on. This changes everything, which is why we have paradoxes of thrift and flexibility.”

Additionally, though, citing Justin Fox, he observes, “People aren’t very receptive to evidence if it doesn’t come from a member of their cultural community.” Fox in turn references a June 2010 essay in Nature [PDF] by Yale Law and psychology professor Dan Kahan examining the “culture war in America over science,” the introduction to which I excerpt below:

People endorse whichever position reinforces their connection to others with whom they share important commitments. As a result, public debate about science is strikingly polarized. The same groups who disagree on ‘cultural issues’ — abortion, same-sex marriage and school prayer — also disagree on whether climate change is real and on whether underground disposal of nuclear waste is safe.

[…]

[P]eople find it disconcerting to believe that behaviour that they find noble is nevertheless detrimental to society, and behaviour that they find base is beneficial to it. Because accepting such a claim could drive a wedge between them and their peers, they have a strong emotional predisposition to reject it.

Our research suggests that this form of ‘protective cognition’ is a major cause of political conflict over the credibility of scientific data on climate change and other environmental risks. People with individualistic values, who prize personal initiative, and those with hierarchical values, who respect authority, tend to dismiss evidence of environmental risks, because the widespread acceptance of such evidence would lead to restrictions on commerce and industry, activities they admire. By contrast, people who subscribe to more egalitarian and communitarian values are suspicious of commerce and industry, which they see as sources of unjust disparity. They are thus more inclined to believe that such activities pose unacceptable risks and should be restricted. Such differences, we have found, explain disagreements in environmental-risk perceptions more completely than differences in gender, race, income, education level, political ideology, personality type or any other individual characteristic.

Cultural cognition also causes people to interpret new evidence in a biased way that reinforces their predispositions. As a result, groups with opposing values often become more polarized, not less, when exposed to scientifically sound information.

Fox notes that the degree to which this was true “was brought home by an experiment I inadvertently unleashed last week.”

I wrote a post here at hbr.org on whether the Internet era has been a time of world-changing innovation or a relative disappointment. It was inspired by comments from author Neal Stephenson, who espoused the latter view in a Q&A at MIT. His words reminded me of similar arguments by economist Tyler Cowen (if I had enough brain cells to remember that Internet megainvestor Peter Thiel had been saying similar things, I would have included him, too). So I wrote a piece juxtaposing the Stephenson/Cowen view with the work of MIT’s Erik Brynjolfsson, who has been amassing evidence that a digitization-fueled economic revolution is in fact beginning to happen.

If I had to place a bet in this intellectual race, it would be on Brynjolfsson. I’ve seen the Internet utterly transform my industry (the media), and I imagine there’s lots more transforming to come. But I don’t have any special knowledge on the topic, and I do think the burden of proof lies with those who argue that economic metamorphosis is upon us. So I wrote the piece in a tone that I thought was neutral, laced with a few sprinklings of show-me skepticism.

When the comments began to roll in on hbr.org, though, a good number of them took me to task for being a brain-dead, technology-hating Luddite. And why not? There’s a long history of journalists at legacy media organizations writing boneheaded things about the Internets being an abomination and/or flash in the pan (one recent example being this screed by Harper’s publisher John McArthur). Something about my word choices and my job title led some readers to lump me in with the forces of regression, and react accordingly.

When I saw that Wired.com had republished my post, I cringed. Surely the techno-utopians there would tear the piece to nanoshreds. But they didn’t. Most of the Wired.com commenters instead jumped straight into an outrage-free discussion of innovation past and present.

That’s probably because, if there is one person in the world whom Wired.com readers consider a “knowledgeable member of their cultural community,” it is Neal Stephenson. This is the man who described virtual reality before it was even virtual, after all. I’m guessing that Wired.com readers were conditioned by the sight of Neal Stephenson’s name at the beginning of my post to consider his arguments with an open mind. Here at hbr.org, where we don’t require readers to have read the entire Baroque Cycle before they are allowed to comment, Stephenson was just some guy saying things they disagreed with.

I’ve experienced an odd variation of that at OTB in recent years. Because I’ve long since ceased to be a reliable member of either team, my arguments are viewed suspiciously. Democrats see me as someone posing as a “reasonable conservative” who is actually on a secret mission to push Republican talking points whereas Republicans view me as a pawn of the Obama administration. After all, there’s no such thing as a person who simply interprets facts about politics honestly.

And that’s people who have some passing familiarity with who I am. Those who come in to the site for the first time from a link somewhere just assume I’m on whatever team my arguments seem to support. In the case of this particular posting, the mere fact that I would cite Paul Krugman without some sort of derogatory string of adjectives (i.e., crazed liberal hack Paul Krugman, who somehow got a Nobel Prize) colors the entire rest of the argument.

Amusingly, Kahan’s article points back to yet another bit of research that closes our circle:

In a famous 1950s psychology experiment, researchers showed students from two Ivy League colleges a film of an American football game between their schools in which officials made a series of controversial decisions against one side. Asked to make their own assessments, students who attended the offending team’s college reported seeing half as many illegal plays as did students from the opposing institution.

Democrats see me as someone posing as a “reasonable conservative” who is actually on a secret mission to push Republican talking points

No, I see you as a conservative who’s wrong a fair amount of the time, but honestly wrong. If you were just a mindless pawn pushing talking points, I doubt that I’d bother engaging with your posts. Same goes for Doug.

If you want to see an example of someone who’s less tribal than either of you, peruse your own blog and check out posts by that Tyler fellow.

At heart, I am just a numbers guy, I think. It sometimes amazes me how we can look at the same set of numbers and come to such different conclusions. I think that means I could be wrong, so I try to engage with others who have drawn different conclusions. It has caused me to modify my views.

On foreign policy, I think there is more subjectivity in taking one’s positions. Still, even in this area there is a lot of history to remember and understand. Smart conservatives who attempt some semblance of neutral analysis rather than partisan cheering (this would include James) have also changed some of my views.

I guess in short, I think that if you find that engagement with others never leads to changing your mind, something is wrong. I know I am not right 100% of the time. My wife reminds me of that all of the time. If I know I can be wrong, then sometimes others must be right.

And here I thought I liked you specifically because you were doing your best to analyze events and facts fairly, regardless of how the “team” wants you to respond, with the tacit acknowledgment that you (like all of us) have your own set of cultural and personal biases that will color your understandings. Please, just keep doing what you’re doing. Even when I disagree with you, I usually learn from you.

I think we are seeing a perfect example of what James describes in the current election. The extent to which you attribute the net loss of jobs and large deficits that have occurred during the Obama administration to Obama’s policies or to the massive downward momentum that began before he took office pretty much indicates what team you’re on.

Democrats see me as someone posing as a “reasonable conservative” who is actually on a secret mission to push Republican talking points

No I don’t James. I see you as someone who once was an uninformed hack but through interaction with the superior intellects of the Left you have modified your views to sometimes admit we actually have an argument which on occasions is correct. Some day soon you will no longer be able to deny that we are always right. 😉

I often discuss issues in the context of motives because I’m trying to point out the bias in a person’s thinking that causes them to skew one way or another. I actually think you, James, have rather less bias than most. I think what sets you apart is that you are at least trying to be fair and just, you’ve set that as a conscious goal. You’re actually looking for the truth. Which doesn’t mean you’re free of bias (no one is) but does mean that you are a giant step closer than most.

It’s important to decide first to seek truth. Then the seeker searches out their biases, abandons as many presuppositions as they can identify, and only then begins to rebuild. I believe the single biggest impediment to this is religious belief. It’s very hard to simultaneously believe in highly improbable fantasies and devote oneself to truth at any cost.

But there’s a particular bias at work on the internet: engineer’s bias. Engineers see every system as a sort of limited, defined machine, a series of pulleys and belts controlled by levers and buttons. Removed from that system is the human being, that creature of imagination and stupidity, wild emotion and herd instinct. So they try to understand the economy using numbers alone and forget that the economy is not separate and apart from society or civilization and is no more rational than anything else involving humans. They understand the internal combustion engine thoroughly and forget that there’s some jackass driving the car who can utterly change the real-world use and effect of that engine.

[P]eople find it disconcerting to believe that behaviour that they find noble is nevertheless detrimental to society, and behaviour that they find base is beneficial to it. Because accepting such a claim could drive a wedge between them and their peers, they have a strong emotional predisposition to reject it.

I think this is important. My sister is a brilliant scientist who has worked in the oil industry for 40 years. The scientist within knows that peak cheap oil and anthropogenic climate change are real. The inner tribe member has trouble accepting it, although with retirement only a couple of years or less away it’s getting easier for her.

@Ron Beasley:
One of the reasons I’ve always been so fascinated by the Righteous Among the Nations, those who rescued Jews in WW2, is that it demanded of these people an ability to put core moral beliefs ahead of self-interest and tribe. They risked their own lives, often the lives of loved ones, in defiance of “their own.” Amazing human beings.

Then the seeker searches out their biases, abandons as many presuppositions as they can identify, and only then begins to rebuild. I believe the single biggest impediment to this is religious belief. It’s very hard to simultaneously believe in highly improbable fantasies and devote oneself to truth at any cost.

I would suggest a better formulation would be to replace religious belief with dogmatic belief.

A scholar/truth seeker’s eye can be brought to any subject. And some of the finest thinkers through the ages have not only been religious, many brought that approach to their own religions.

The problem is that too many believe that the only faith is that of a child’s, unquestioning, 100% secure. And that’s where the problem occurs. When one moves from belief/faith built upon struggle and inquisition to belief/faith built upon dogma, then one can no longer possibly see truths, even when they are laid bare.

I agree, many, probably most of the great thinkers have been religious. But it nevertheless creates a problem requiring a two-track thought process: 1) Empiricism, 2) Faith. They are pretty much opposites.

The effect can distort even brilliant minds, with Pascal’s wager being a classic example, which begins with assumptions that trap the great man in nonsense. Essentially, “Why not believe in God since belief may result in heaven.” (Paraphrasing.) Well, because the same logic can be applied to any belief that holds the possibility of profit. It’s a hole you can drive a truck through. Why not believe in Leprechauns, maybe you’ll get a pot of gold? Why not believe in that nice Mr. Lenin, maybe we’ll have perfect equality? Come to think of it, why not believe in Satan, since one has to assume he protects his most fervent believers in the event that hell is real?

@michael reynolds: Perhaps some of this comes down to different understandings of the foundations of “faith.”

From my understanding, and I’m not a theologian, faith is not simply belief in the absence of evidence. Rather, I understand faith as something that grows from and through a deep and painful interpretation of evidence.*

In terms of Pascal’s wager, that to me is not a particularly good example of faith or strong Religious thought in action. If anything I would call it an example of cynicism at its worst.

I will also freely admit that Religious institutions have a tendency *not* to develop particularly healthy forms of truth seeking in many circumstances. But that, in my mind, is an implementation bug (due more to the “institution” part than the “religion” part).

I also tend to think that the binary or dualistic thought process you are referring to is more a byproduct of European thought than necessarily Religious thought. Remember that the majority of those religions came of age in Europe. Likewise this same sort of Dualism exists (or is taken to exist) in much of European philosophy.

When you look more broadly, across thought and religions in other parts of the world, you find a number of systems where things are far more nuanced or at least less oppositional.

@michael reynolds: I think the best response to Pascal’s wager was Terry Pratchett’s retelling of the story, where the philosopher wakes up after death in the midst of some very pissed off gods armed with large clubs: “we’ll show mr. tricksy philosopher what we think of his logical arguments….”

It was a pretty good post addressing some human tendencies. However, it contains some of those same flaws, for example the desire to simplify things. Some of that is due to the need to be concise, but much of it is not. You talk about people using household economics for their perspective, but you oversimplified that as well. Sometimes cutting certain household expenses doesn’t save you money in the long run; cutting unnecessary or luxury expenses would. You naturally oversimplify macroeconomics as well, all in order to help prove your point.

Another tendency that was touched on somewhat in the post is that people tend to blindly believe what is written or produced by what is considered an authoritative entity, and sometimes even by an unknown entity. Just because it comes from a newspaper, a study, a party leader, etc. doesn’t make it so.

Using critical processes and being open-minded is the key to seeking truth. However, people are usually not concerned with the truth but with their team winning.

One more tendency is people make it about themselves and throw in how it affects them when discussing almost anything. Doing so often, how do I say this, puts them on a team.

To throw myself into this, I often get frustrated with people who, even after I tell them, can’t understand the difference between when I am discussing something analytic and when I am discussing something from a personal perspective or belief system. They are not the same.

@mattb:
I’m not blaming dualism on religion. Nor am I holding up Pascal as an example of religious thought. (I think you’re right, it’s a masterpiece of cynicism and too clever by half.) What I’m saying is that presuppositions — unexamined assumptions — distort like a black hole in the mind, bending reality in ways the individual may not recognize. In his cleverness Pascal reveals the presupposition.

The degree of thought or struggle involved in a decision to accept the reality of things for which no evidence exists does nothing to alter the fact that it is still just a baseless assumption. People don’t come to a belief in God through evidence — there is no evidence — but through faith. It is in no way superior to faith in the return of Quetzalcoatl. The “struggle” involves subduing the need for evidence, in effect suppressing empiricism in favor of faith.

Newton was a man of faith. Also a somewhat noted mathematician, I believe, in fact often credited with doubling mathematical knowledge during the course of his lifetime. Smarter than thee or me? Uh, yeah, just a wee bit. He also wasted an enormous amount of time looking for secret codes in the Bible and trying to discover when the second coming would take place. So, capable of supreme empirical insight and yet obsessed by issues of faith. It doesn’t seem to have hurt his calculus, but by taking precedence over his more useful work it may have delayed even greater scientific insights.

Belief in God requires an acceptance of things for which no evidence exists. This creates a paradox in a thinking person who demands evidence in other areas.

Firstly that wasn’t what you said. You referred to “devote oneself to truth at any cost”. Since a devotion to truth is a subjective process, obviously someone can devote oneself to truth while also looking for e.g. religious truth. You conflated truth with evidence.

Secondly there has never been a requirement for human beings to be stringent in every aspect of their lives. It is not at all difficult to consciously exempt a specific area of your thinking from standards you try to apply everywhere else. Just ask any parent, sports fan or ethics teacher :-).

As long as the area is clearly defined and the assumption is made both consciously and openly, this creates no specific problems.

You referred to “devote oneself to truth at any cost”. Since a devotion to truth is a subjective process, obviously someone can devote oneself to truth while also looking for e.g. religious truth.

So you suggest that we can devote ourselves to “truth at any cost” so long as the cost doesn’t include putting at risk our presuppositions? A devotion to truth at any cost includes obviously a possible cost to the individual in terms of his belief system. If he’s carving out “no go” areas it’s not any cost, it’s at any convenient, easily-bearable cost.

The degree of the devotion may be subjective, I don’t believe truth is. I still believe in objective reality, actual, Capital T truth, otherwise it’s not a search for truth, it’s a search for gratification or what have you. If there’s no truth there’s not much point searching for it.

Secondly there has never been a requirement for human beings to be stringent in every aspect of their lives.

Well, there is such a requirement if one wants to be stringent. No one’s saying there ought to be a law, but if you want to pursue truth at any cost, yeah, a certain degree of consistency and even ruthlessness is required. That’s what makes it, “at any cost.”

The “struggle” involves subduing the need for evidence, in effect suppressing empiricism in favor of faith.

Man, if I had to do everything in my life based on evidence, how the hell would I ever meet someone for lunch? I suppress the need for evidence constantly, as does everyone. It’s the only damn way you can get along with people. How else are you even supposed to meet new people if you don’t have a marginal faith in courtesy or in society’s laws to act as a deterrent against people maiming each other for shiny things?

Faith, empiricism, they have their uses. As Paracelsus said, “Poison is in everything, and no thing is without poison. The dosage makes it either a poison or a remedy.”

No one’s saying there ought to be a law, but if you want to pursue truth at any cost, yeah, a certain degree of consistency and even ruthlessness is required. That’s what makes it, “at any cost.”

Given that there’s no person alive nor has ever been that meets this standard the premise seems somewhat irrational.

So you suggest that we can devote ourselves to “truth at any cost” so long as the cost doesn’t include putting at risk our presuppositions?

Putting them at risk is completely acceptable. My main point was that, as far as I am concerned, a “seeker of truth” may not ignore scientific evidence but he is not forced to limit his mindscape to things that can be conclusively proven using the scientific method. There is a difference between scientific positivism and rational thought.

Man, if I had to do everything in my life based on evidence, how the hell would I ever meet someone for lunch? I suppress the need for evidence constantly, as does everyone. It’s the only damn way you can get along with people. How else are you even supposed to meet new people if you don’t have a marginal faith in courtesy or in society’s laws to act as a deterrent against people maiming each other for shiny things?

Well, you could rely on past experience that in general people show up for lunch dates, and on your memories of delicious sandwiches.

In fact, people don’t generally maim each other. Empirically it’s pretty rare. I’m going to dinner tonight with a former editor. He didn’t maim me last time we met, so I’m not expecting to be maimed this evening.

However, if I am maimed, I’m going to think it’s pretty suspicious you having some kind of knowledge before the fact. Oh, yeah, if I’m maimed, totally going to come looking for you, Tillman.

@michael reynolds: I often discuss issues in the context of motives because I’m trying to point out the bias in a person’s thinking that causes them to skew one way or another.

Sounds to me like you’re more concerned with who you’re talking with than what they’re saying. And by questioning their motives, you’re impugning their statements.

And your description of Pascal’s Wager matches my own thoughts about it… until I did a bit of research on it. Pascal did not say one should believe in God simply based on the odds, but one should act as if God was real. He didn’t address “faking belief” at all.

He elaborated that he hoped that the “acting” would make the person more likely to take the leap into true faith, but at no point recommended people fake a faith in the hopes of divine reward.

Again, not looking to pick a fight with you over Pascal — I had the same opinion of it for years and years. But a little digging one time (when I was making the same argument you are here) showed me I had it wrong. You might wanna do the same.

The problem is human perception. The mind makes shortcuts as it tries to understand all the various inputs from human sensation. It tends to try to fit information into preexisting mental patterns (mindsets). This effect is well established scientifically. Here’s a cool video explaining one simple example – the McGurk Effect.

So, the problem with empiricism is that our perception isn’t empirical. And our mind tries hard not to be empirical. That’s why we have double-blind studies – we can’t trust ourselves to run experiments without unconsciously influencing the results. People have great difficulty perceiving possibilities that fall outside of established mindsets.

And this is why it is usually such a bad idea to focus on motivations. If one believes that Republicans are serial liars who hate women, or believes that Iran is intent on acquiring a nuclear weapon, or believes that his wife is loyal and couldn’t possibly be cheating, then the tendency is to automatically interpret information to fit with those mindsets. Others, operating under different mindsets, will interpret the information differently. Assuming that such differences in interpretation are explained by motives is a major cognitive mistake.

Also, I don’t really agree with you regarding religion. Faith is a part of human existence, and it’s the result of mental process described above. Religion isn’t cognitively any different than other kinds of “faith.”

Finally, two last points. First, some things aren’t empirical. Secondly, science, unless it is the result of rigorous double-blind methods, is going to be flawed. Empiricism is a goal we usually cannot achieve.

I read Husserl while in 11th grade at a private school for annoying little smart kids in DC that my parents got me into in an effort to keep me in school. Didn’t work. I chose to drop out and sell Barbies and Baby-G-Bye-Byes at Toys R Us in Marlow Heights. Making $1.60 an hour working retail at Christmas is still better than reading Husserl.

@michael reynolds: And you have such infallible judgment, do you? You immediately decide who’s right and who’s wrong, then act based on that.

I just challenged you on Pascal’s Wager, clearly and politely told you you were wrong. And instead of double-checking your own beliefs, you brush it aside. What does that say about your motives for repeating something false, without bothering to confirm your own beliefs? Would you like a link or two to back up my interpretation?

As an armchair psychologist, I’d say it’s arrogance bordering on narcissism — you’re so absolutely convinced that you’re right, you can’t even acknowledge you might be wrong. And if you are, it’s not relevant. Quite the defense mechanism.

@Andy:
I think you actually argue against your own point. The fact that we are necessarily a subjectivity, that we will (deliberately or not) distort our perceptions, argues that understanding biases is important. It’s part of the intellectual forensics of understanding how error occurs. When you see light bending you go looking for a cause and discover that space can be curved. Discovering a bias in ourselves is necessary to avoiding its effects in the future; discovering it in someone else is a clue to locating the error and exposing it.

By the way, I don’t want any of this to sound like I’m posturing as someone who is bias-free. I have my share. I try to see them and eliminate them or at least label and contain them, but of course I fail as often as I succeed.

And you have such infallible judgment, do you? You immediately decide who’s right and who’s wrong, then act based on that.

I don’t think I said anything about infallibility. I’m not claiming to be the pope.

As an armchair psychologist, I’d say it’s arrogance bordering on narcissism — you’re so absolutely convinced that you’re right, you can’t even acknowledge you might be wrong.

Actually, I frequently admit I’m wrong and I beat myself up for it, sometimes for years. I was wrong about Iraq, I was wrong about true love, wrong about having kids. I cast my first vote for Richard Nixon. Dude, I have a looong list of the times I’ve been wrong. But your diagnosis of arrogance? Well, duh. I’m an author. I’m still waiting to meet my first humble one of those.

The fact that we are necessarily a subjectivity, that we will (deliberately or not) distort our perceptions, argues that understanding biases is important. It’s part of the intellectual forensics of understanding how error occurs.

I don’t disagree, but biases are not the same things as motivations. It’s one thing to point out that someone’s interpretation of evidence is flawed, or incomplete, or not the only possibility. It’s quite another to suggest (or declare, as is often the case in internet debates) that one knows the motivations behind the other person’s perceived bias. We often don’t fully understand our own motivations, since cognitive biases are unconscious, so attempting to understand the motivations of others becomes very difficult, to say the least. In geopolitics, it’s downright dangerous.

@Jenos Idanian: Acting as though there is a God even though you don’t believe in one is faking it, in the hopes that god wouldn’t know the difference if she really does exist. And that’s the flaw in the reasoning: that god wouldn’t mind being used as an insurance policy.

@Ben Wolf: You’re not quite catching the point. As I read Pascal’s piece, he seemed to be saying “behave as if there was a God” — in other words, act as if you will face eternal judgment for your actions. There’s something about having someone looking over your shoulder that tends to put most people on their best behavior.

He doesn’t advocate pretending to have faith, or representing one as devout when one isn’t; it’s all about conduct. Don’t be dishonest to others or to a possibly-nonexistent god; just be a decent human being.

I have Neal Stephenson, Tyler Cowen, and Justin Fox in my Google Reader, so … maybe I shouldn’t even pause to say I like the introduction 😉 According to these rules my endorsement is suspect.

But seriously, we should all understand this as the backdrop for our OTB discussions by now. We saw it oh so clearly in “economic” articles about whether stimulus works, and “historic” opinions of what ended the Great Depression. (Other classic OTB examples appear up-thread.)

Oh, and as I’ve mentioned before, I think the answer to the jumping-off question, about whether technology change has slowed, is no.

It’s another cognitive flaw. “Today is slow” folks are unconsciously comparing changes this year to changes that happened in whole centuries. “Look at the 19th century” they say, and sometimes overtly “compare that to today!”

Well, the 19th century was 36,500 times longer than today. It should have more in it.
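For what it’s worth, the “36,500 times” figure above is just a century counted in days. A quick sanity check (my sketch of the commenter’s arithmetic, treating “today” as a single day and ignoring leap days):

```python
# Ratio of a century to a single day, ignoring leap days.
years_per_century = 100
days_per_year = 365
ratio = years_per_century * days_per_year
print(ratio)  # 36500
```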

Matt, as an atheist I have to ask (no snark intended), what evidence? I am truly curious.

Admittedly, the evidence is purely subjective — the notion of seeing God in things. Or in some cases in the spaces between facts (sort of like the notion of “something magic happens” in the cartoon with the two mathematicians working on a complex formula). One example of this latter type is the watchmaker analogy.

I fully admit that to an atheist, this can seem weak evidence indeed. Or in some cases this same “evidence” may be used to justify one’s atheism.

For a far more compelling take on this, I suggest looking at some of the writings on faith by C.S. Lewis or Reinhold Niebuhr.

He doesn’t advocate pretending to have faith, or representing one as devout when one isn’t; it’s all about conduct. Don’t be dishonest to others or to a possibly-nonexistent god; just be a decent human being.

The issue, depending on your brand of Christianity, is the question of salvation through belief versus salvation through good works.

To really simplify things, beyond the question of indulgences and other Church excesses, this was at the center of the Protestant Reformation.

Luther’s main beef was his argument that Christians are saved by belief alone. Good works, in Luther’s mind, while something to be encouraged, cannot win points with God. It’s a binary: you either believe or you don’t. And to Ben’s point, if that belief is based on a cynical wager (or, more accurately, a cost/benefit analysis), then that undercuts the entire system.

BTW, this is what Calvin turns up to 11, suggesting that nothing you can do — including believing — has any effect on your salvation.

@Jenos Idanian & @Ben Wolf: I should note that Pascal did address the issue of cynical belief and suggested that it by itself would not be good enough for salvation. His answer was, if you accepted his wager, to then redouble your efforts to come to God and transcend what he saw as an overreliance on reason through the cultivation of faith.

I think that one of the things that’s missing from this discussion is that we don’t have the equipment to perceive facts. We only have the equipment to perceive events and those only partially and incompletely. Once perceived we can never recapture the events—playing them over in our minds creates interpretations of events. It does not create facts.

All of us believe things to be facts that may be no more than botched interpretations. That’s true of me, that’s true of Michael, that’s true of James, that’s true of Dr. Krugman.

If you assert something that’s not verifiable as fact, that’s a belief. Even interpretations that are verifiable aren’t facts, strictly speaking; they’re working hypotheses. Good working hypotheses get us safely to the moon. Or across the street.

I think there’s a sort of hierarchy of credibility. Things that are neither verifiable nor disprovable fit into a class of their own. Are they fictions? Facts? We don’t have the equipment to tell the difference. Tribal culture does not give us that equipment. Believing in what the tribe believes in does not identify facts. It signifies membership in the tribe. That’s useful, too.

Because societies are much more complex than the harder sciences like physics and biology, the author believes that attempts to apply the reductionist methodology will be doomed to failure. He writes that skepticism should be the order of the day in considering claims for the efficacy of new programs that ostensibly need scientific testing for validation. Manzi contrasts the application of the methods of the controlled experiment from the biological sciences with efforts to apply random-testing methods to criminal-justice, education and social-welfare programs. The author argues that the ultimate decisions on the application of such methods are “outside of science” — they are political and depend on answers to the question of “what kind of society we want to build.”

There IS such a thing as science, and it CAN establish facts. It is a fact that water melts at 0 degrees C (at standard pressure). It can’t be established, in the same way, that stimulus “works” (in no small part because there is no equivalent to “standard pressure”).

Of course there’s such a thing as science. And, more importantly, engineering. Is your example a fact? We do not know. We can infer it but we cannot prove it. Inductive reasoning may suggest it’s a fact. It’s worked every time it’s been tried. That’s good enough for most purposes.

My point, really, is that the word “fact” gets thrown around a lot. It’s used to refer to all sorts of beliefs, speculations, working hypotheses, even figments. I’m reasonably comfortable with treating your example as a fact, sufficiently caveated (as you did). But I think we should be very careful about what we think of as facts.

If you can’t buy that as a fact, you are making the argument that there are no facts.

We do not know. We can infer it but we cannot prove it. Inductive reasoning may suggest it’s a fact. It’s worked every time it’s been tried. That’s good enough for most purposes.

You actually bolded “we do not know,” on the subject of when water melts(!)

I don’t know, man; if you are just using a different scale, treating 0 degrees C as “highly reliable,” it may just be a difference in semantics.

When something is that reliable, I think the word we use is “fact.”

My point, really, is that the word “fact” gets thrown around a lot. It’s used to refer to all sorts of beliefs, speculations, working hypotheses, even figments. I’m reasonably comfortable with treating your example as a fact, sufficiently caveated (as you did). But I think we should be very careful about what we think of as facts.

Certainly “fact” gets thrown around, but my position, and I think the one argued by Manzi, is that we should be very conscious of where facts stop.

@john personna: Dave is talking phenomenology, not science. Yes, in the past, we know water changes states at a given temperature and pressure. We don’t know with a degree of perfect certainty that this will be the case in the future, merely that it’s the prudent operative assumption.

Oh I think the line is bright, but that it stops early. If I, myself, take a temperature with a calibrated thermometer (not a caveat, a condition) then I can treat the reading as a fact. This is fundamental to our world. Hell, a coyote takes the smell of a rabbit as fact that a rabbit has been near.

If you can’t buy that as a fact, you are making the argument that there are no facts.

Well, there’s the case of what happens to one molecule of water in isolation at 0 deg C. Ice is, after all, a crystal. Can you have a crystal from one molecule? Then there’s also this.

And even that conclusion is not an absolute fact since it was arrived at via modeling. In reality, we don’t know what the real freezing point of water is, but we can get close enough in 99.99% of situations.

(Supercooled and superheated fluids are unique in their conditions. The science is understood, and it did not change “freezing point” or “boiling point” in the hundred years or whatever since the discovery.)

@john personna: There have been quite a few discussions of various constants not actually being constant throughout the universe (or in parallel universes). I guess my only point here is that you have to add a lot more caveats than just “at standard pressure.”

On the other hand, the Celsius scale used to actually be defined by the melting point of water at standard pressure, so I guess by definition it used to be a fact.

Think back. The great leap forward in western science came in the 17th century, growing out of the enlightenment, and with the establishment of the royal societies.

In fact, the first two items in SparkNotes’ “roots of the enlightenment” are:

1605 Kepler discovers first law of planetary motion
1609 Galileo develops his first telescope

What we have there is fact as established by direct observation, and then an attempt at understanding.

Going back to the transition from fact to uncertainty, I fear we are losing that grounding. It’s one thing to name corner cases for freezing point, as an intellectual game, but you kind of need to understand the nature of reality to even play that game.

(lolz, parallel universes. I think some tax plans come from there as well.)

No. Changing the frame of reference to quantum probability does not change the macro reality.

Well, first of all, the absence of a necessary impurity for water to crystallize at 0 Celsius is not a question of quantum probability. Secondly, “water freezes at 0 degrees Celsius” is only a “fact” given a certain set of assumptions. “Facts” rely on such assumptions, and when those assumptions are no longer operative, the “fact” changes or is no longer valid. We can use “water freezes at 0” as a rule of thumb for the vast majority of situations not because it is some kind of universal truth, but because the conditions that give rise to that “fact” are exceedingly common in our experience and therefore applicable in most situations.

This problem of unstated assumptions exists practically everywhere and is a much bigger factor in determining how valid “facts” are in areas outside the hard sciences. What “facts” are there in, say, economics?

And finally, even if people can agree on facts, there can still be disagreement over the meaning of facts, analysis of facts and opinion based on facts.

The statement that “water freezes at 0 Celsius” is a conditional statement. It is a statement that is “close enough” in most cases that we don’t really have to think about it. If that kind of “close enough” constitutes a “fact” then I guess we agree.

@Andy: True. Everything is ultimately a conditional statement, conditional on the assumption that we accurately perceive the universe rather than the machinations of Descartes’ evil genius. Nothing can be known with absolute certainty.

@Jenos Idanian: Acting differently, as mattb correctly points out, was not Pascal’s ultimate goal: becoming a person of faith by acting differently was the endpoint for him. Pascal acknowledged that simply being nicer because you’re afraid you might be punished was insufficient.

I think there’s a difference between, say, the freezing point of water, which can be affected by various factors, and something like the direction the Earth orbits the Sun, or the fact that the Earth spins on its axis.

@Andy: I suppose it depends: there are conditions which also affect the Earth’s rotation. Every rocket we launch, for example, steals a tiny but real quantity of energy and therefore slows that rotation down. Change the forces acting on the Earth and its elliptical orbit changes.

Again, no. What you are doing is taking a general statement and calling it a conditional statement.

You are naming things that are each separate and equally known facts, and then claiming that they disprove each other. Chemists have used tables of melting point and boiling point for hundreds of years. They are useful facts. That under special conditions liquids can become supercooled or superheated are additional facts which do not change the first set.

One of the reasons chemists use melting point, and I used melting point in my first comment on the subject, was in fact because supercooling is not an issue.

You’ll have to work much harder to find conditions where water does not melt at 0 degrees C, but even if you do, you won’t be disproving the general statement, for humans, on earth.

Here’s the experience of “facts on the internet” that I’m feeling today.

Rule One: even the simplest fact will draw out dissent. You can say “the sun rises in the east” and someone will pop up and dispute the generality, with fine points of compass and celestial mechanics. To save you time:

Actually, the Sun only rises due east and sets due west on 2 days of the year — the spring and fall equinoxes! On other days, the Sun rises either north or south of “due east” and sets north or south of “due west.”

So I was wrong!!!! The sun does NOT rise in the east!!!! etc.
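For what it’s worth, the quoted fine point does check out under the standard spherical-astronomy approximation for sunrise azimuth, cos A = sin(declination) / cos(latitude). This sketch is my own illustration (it ignores atmospheric refraction and the sun’s angular size); it shows the sun rising due east only when the solar declination is zero, i.e. at the equinoxes:

```python
import math

def sunrise_azimuth_deg(declination_deg, latitude_deg):
    """Approximate sunrise azimuth, measured in degrees from due north
    toward the east, for the moment the sun's center crosses the horizon.
    Uses cos(A) = sin(dec) / cos(lat); ignores refraction and the sun's
    angular size, so it's a rough sketch rather than an ephemeris.
    """
    dec = math.radians(declination_deg)
    lat = math.radians(latitude_deg)
    return math.degrees(math.acos(math.sin(dec) / math.cos(lat)))

# At an equinox (declination 0), the sun rises due east: azimuth 90.0
print(sunrise_azimuth_deg(0.0, 40.0))   # 90.0
# Near the June solstice (declination ~23.4), it rises north of east
print(sunrise_azimuth_deg(23.4, 40.0))  # ~58.8
```

So both statements hold at once: “the sun rises in the east” as the everyday general fact, and the equinox caveat as the additional fact that refines rather than disproves it.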

Rule Two: fringe ideas can be represented as facts, and the same sort of provocateur will ring in with an endorsement. “Sometimes the sun rises in the west.” “Yes, I’ve seen it too!”