An extremist, not a fanatic

October 27, 2016

Resisting asymmetric Bayesianism

It’s a commonplace that politics is becoming more polarized and tribal: think of Remainers vs Brexiters or Trumpites vs Clintonites. This isn’t simply because people have retreated into intellectual trenches in which they associate only with the like-minded and get their news only from sources that echo their prejudices. It’s also because of another mechanism which is variously called attitude polarization, the backfire effect or asymmetric Bayesianism.

This is the process whereby people who are confronted with evidence that disconfirms their beliefs do not become more open-minded, as Bayesianism predicts, but rather become more dogmatic. The earliest evidence on this came from a 1979 paper (pdf) by Charles Lord, Lee Ross and Mark Lepper. They showed people mixed reports on the effects of the death penalty. They found that, after reading these reports, people who supported capital punishment became stronger in their support, whilst opponents of the death penalty also became more entrenched. This arises from a form of confirmation bias: we look favourably upon evidence that confirms our position but sceptically upon disconfirmatory evidence.
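The mechanism can be made concrete with a toy simulation (my sketch, not anything from the paper): a symmetric Bayesian applies Bayes’ rule to all evidence equally, so mixed evidence leaves their belief where it started; an asymmetric updater discounts the force of disconfirming evidence before updating, so the same mixed evidence leaves them *more* confident. The `discount` parameter is an illustrative assumption.

```python
def bayes_update(prior, p_e_given_h, p_e_given_not_h):
    """Posterior P(H|E) via Bayes' rule from P(H) and the two likelihoods."""
    num = prior * p_e_given_h
    return num / (num + (1 - prior) * p_e_given_not_h)

def asymmetric_update(prior, p_e_given_h, p_e_given_not_h, discount=0.5):
    """Discount evidence that cuts against H, then apply Bayes' rule."""
    if p_e_given_h < p_e_given_not_h:  # this evidence disconfirms H
        # pull the likelihood toward neutrality, weakening the update
        p_e_given_h += discount * (p_e_given_not_h - p_e_given_h)
    return bayes_update(prior, p_e_given_h, p_e_given_not_h)

prior = 0.8                 # a death-penalty supporter's initial credence
confirming = (0.6, 0.2)     # evidence 3x more likely if H is true
disconfirming = (0.2, 0.6)  # evidence 3x more likely if H is false

# symmetric: the two reports cancel, belief returns exactly to the prior
sym = bayes_update(bayes_update(prior, *confirming), *disconfirming)

# asymmetric: the disconfirming report is half-discounted, belief strengthens
asym = asymmetric_update(asymmetric_update(prior, *confirming), *disconfirming)

print(sym)   # 0.8  -- unchanged by balanced evidence
print(asym)  # ~0.89 -- more dogmatic after the same evidence
```

This reproduces, in miniature, the Lord, Ross and Lepper pattern: balanced evidence pushes an asymmetric updater further toward their starting position.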

Reasonable people lament all this.

But here’s my problem: I find it damnably hard not to be an asymmetric Bayesian. One reason for this is that having learned about cognitive biases I now see them everywhere. Yes, I know – that’s the confirmation bias. But of course, it’s easier to see them in others than in oneself: physician, heal thyself.

But my opponents don’t help me. Many who favour tough immigration controls, for example, look like hysterical nativists who don’t even have the wit to recycle Rawlsian law of peoples-type arguments. And many aren’t even trying to convince me; perhaps most political discourse is preaching to the choir rather than an attempt to persuade opponents. (The left are probably as guilty as the right here.)

So, how can I resist the pressure to be an asymmetric Bayesian? Here are four things I try:

- Construct your own argument for something you don’t believe in. I tried this here, here and here.

- Look for weaknesses in your own theory. For example in yesterday’s post, I didn’t establish that people in power actually are racist, but merely expressed scepticism about the validity of other explanations. That wasn’t good enough. (In fairness, though, proof on this matter is perhaps unobtainable.)

- Look for disconfirming evidence. To take yesterday’s post again, I might have pointed out that Indian men earn more than similarly qualified white men. This might be evidence against employer racism. (But is it proof? It’s possible both that Indians sort into high-paying occupations and that they suffer racism which causes their wage premium over whites to be lower than it otherwise would be.)

- Look for intelligent arguments against your case. The fact that most of one’s opponents are silly and hysterical doesn’t mean they all are. For example, I especially valued this post by Ben Cobley, and welcome the writing of Bryan Caplan and his colleagues or Andrew Lilico.

Of course, in saying all this I don’t claim to succeed in resisting the pressures of asymmetric Bayesianism. No doubt I fail.

But here’s the problem. My efforts to do so arise solely from intrinsic motivations. External incentives actually encourage asymmetric Bayesianism. The MSM select for overconfident blowhards; Toby Young, remember, actually earns a living. And most politically-engaged people on left and right would rather associate with a fellow partisan than a pointy-headed sceptic. When Michael Gove said that the country has had enough of experts, he was in part expressing a truth.

Comments

I say this in all humility - when I buy and sell shares there seems to be very little link between my research and the results I get. Perhaps reality is so messy that following gut instinct is as good a policy as any attempt at creating a rational framework?

Even if a rational framework is possible, intelligent people will have their own rational frameworks and it seems impossible for them to see the value of a second one?

"When Michael Gove said that the country has had enough of experts, he was in part expressing a truth."

Ironically, it is the very people who despair of others not paying attention to facts and experts who keep using this quote to sum it up - when it's not what Gove really said (see https://www.youtube.com/watch?v=GGgiGtJk7MA for what he did say).

Even more ironic is that what he said has so far proven more accurate than the forecasts from the 'organisations with acronyms' that he was talking about.

I don't think you cite him, but as a callow youth I was tremendously influenced by Karl Popper who also wrote that if we want to progress towards the truth we should be actively seeking to change our views, looking for weaknesses in our own positions, strengths in others.

Like you I suspect I rarely manage to hold true to that ideal, and sometimes I fear that trying to has turned me into an eternal fence sitter. I perpetually struggle with the idea that some things must be worth fighting for, in whatever way available, yet that requires more certainty than I am usually able to muster. Or at least, if you want to fight, you usually have to throw in with the polarised positions available. This was my problem before the invasion of Iraq - although I think those who say that anybody with basic knowledge of the USA and the world should have seen how it would turn out (i.e. D2, FR) are right, at the time I was uncertain and found merit in *some* of what the "decents" said. In retrospect that was a mistake, which might reveal a drawback of trying to be openminded? I dunno.

Something else I think is helpful to do here is try to often admit error. Even if that just means acknowledging when you wrote something badly so it was open to misinterpretation - even that in my experience is too rarely done, and doing it (genuinely, it is easy to do insincerely) can stop you from solidifying around your own opinions.

A possible problem with the death penalty experiment. Differences in views on the death penalty may be more to do with relative values than the effects of the death penalty. So you might care a lot about evidence of one thing, but not care much about evidence of another. A perfectly rational response to mixed evidence showing both things could, therefore, be to firm up your position. For example, if you care a lot about unsound convictions but little about people that have committed horrible crimes being kept in a cell with a telly, evidence showing both would support your position against the death penalty.

I recently spent time looking at Global Warming - is it happening, is human activity responsible, what happens next etc. I ended up less sceptical than when I started, but what was noticeable is the amount of debate that happens over every single point. Just look at any subject on https://www.skepticalscience.com and the extensive comments.

What is really easy is to pick out papers that support your natural view and decide this person is the real expert, and all the rest are second rate.

It is remarkably easy to see those for whom facts and logic are irrelevant but there seems to be little reason to debate them. One can take it on as an educational mission but without much expectation of success.

In his obituary of Harry Johnson in 1977, Jagdish Bhagwati said that Johnson moved to the right after he went to Chicago because he was open minded. When his colleagues made a correct argument, he shifted his position, but when he made one, they did not. Is this not the danger of being a symmetric Bayesian in a crowd of asymmetric ones?

As a practising statistician, can I put in a plea for Bayesianism? With a 2-level outcome, a prior belief that is overwhelmingly weighted to one side will move to an oppositely weighted posterior only by the most overwhelming evidence. But one of the side-effects of fanatical belief is to self-censor news sources and debate and thereby to remove or weaken the strength of any evidence presented. Thus very seldom are strong believers moved from their position, not because the Bayesian approach needs adjustment, but because of the behavioural side-effects of strong belief.
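The commenter’s point is easiest to see in the odds form of Bayes’ rule: posterior odds = prior odds × likelihood ratio. A quick sketch (my illustration, with made-up numbers) shows how extreme a likelihood ratio is needed to dislodge a near-fanatical prior, and why self-censoring evidence sources keeps believers in place:

```python
def posterior(prior, likelihood_ratio):
    """Bayes' rule in odds form: posterior_odds = prior_odds * LR."""
    odds = (prior / (1 - prior)) * likelihood_ratio
    return odds / (1 + odds)

# A prior of 0.99 (odds 99:1) needs evidence ~99x more likely under the
# rival hypothesis (LR = 1/99) just to pull the believer back to 0.5.
at_fence = posterior(0.99, 1 / 99)
print(at_fence)  # 0.5

# Self-censoring news sources caps the strength of evidence encountered:
# with only mildly disconfirming evidence (LR = 0.5), belief barely moves.
barely_moved = posterior(0.99, 0.5)
print(barely_moved)  # ~0.98
```

So, as the comment says, the stickiness of strong belief needs no modification to Bayes’ rule: weak evidence is a behavioural side-effect of the belief itself.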

Alan, the assertion above that Bayesianism predicts convergence of beliefs is false anyway. 'Bayesian epistemology' would only be useful as a naive or 'first order' model of cognition** and Bayesians have no trouble explaining / predicting divergence of beliefs*. There is no need for some 'new' idea - 'asymmetric Bayesianism' - at least not outside of the apparently epistemologically closed world of economics.

In a world short of demand those abroad have nowhere else to sell their goods other than here. So they will take less money for their stuff, because otherwise they will get nothing. They lose. Good day exporters!

It's time for the UK's buyers to earn their crust and screw down their suppliers hard.