Making your mind up

I have managed to steer clear of most of the election coverage in the media, for very much the same reason that I never watch BBC Question Time or Prime Minister’s Questions on TV. The tone of debate is generally appalling – adversarial and completely free of any evidence except that plucked and teased into shape by party hacks. An example can be found in the Institute for Fiscal Studies’ analysis of the Labour and Tory manifesto plans for the economy. They find that, after five years, the National Debt will be 77% of GNP if Labour win, and 72% if the Conservatives win. There is a debate to be had about which is the best approach, and you know where I stand. But can anyone but an ideologue truly believe that the Tories will lead us back to the 1930s, or that Labour will unleash an economic disaster, on the basis of that?

Another example. The day after the election leadership debate, neutrals tended to feel that the leader of the Scottish National Party was most impressive, Cameron effective but not overwhelming, whilst noting that Miliband’s poll ratings had improved. However, the Sun featured a front page which said this was the day that Miliband lost the election. The Mirror, to no-one’s surprise, said that Miliband had come out on top.

Some of these reactions are simply cynical. Private Eye, for example, reports that the Sun headline was published before the actual event took place – whatever actually happened, Cameron was going to win. But, sad to say, I suspect that most people’s views about winners and losers at the debate were sincere. Socialists really did think that Cameron was slimy and evasive; Tories similarly felt that Miliband was unimpressive and irresponsible. Lenny Bruce reported the same effect at the famous Nixon v Kennedy debates in 1960: Republicans thought Nixon trounced Kennedy, and vice versa.

These are examples of what is known as confirmation bias: people’s habit of interpreting information in a way that supports their pre-existing beliefs in order to avoid cognitive dissonance. One experiment gave two groups of people the same articles and statistics about murder in different states of the USA, covering those which had the death penalty and those which did not. People opposed to the death penalty thought the evidence showed that capital punishment did not reduce murder rates; supporters thought the facts showed that it did. In another famous experiment, in the 1990s, Lee Ross, a psychologist from Stanford University, took peace proposals authored by Israeli negotiators, labelled them as Palestinian proposals, then asked Israeli citizens to assess them. “The Israelis liked the Palestinian proposal attributed to Israel more than they liked the Israeli proposal attributed to the Palestinians,” he said. Mark Twain is alleged to have said that he never saw anyone’s mind changed by an argument, and it is hard to say he was wrong.

Of course, arguments based upon moral beliefs need not depend on logic. You may believe people should keep any money they earn, or that they should be taxed to provide support for others. That’s a value judgement, and you can’t argue one way or the other. What might be germane to this debate is evidence that high taxes reduce a nation’s wealth, or (by contrast) that current levels of taxation have no effect on effort: but that wouldn’t be decisive to the underlying judgement you make about how we should behave, or how we should regard personal property.

But most arguments are not like this. Parties don’t differ about whether it’s good to have an efficient economy, or high quality education, a strong army, decent support for the old and disabled, or a fine health care system. The argument is about how best to achieve those agreed goals, and this is where evidence and logic need to come into play. That’s why confirmation bias – people’s inability to look at the facts in a disinterested manner – matters. It is worrying for those of us who would wish to believe that one of humanity’s greatest glories is our ability to use science and logic. What is the way forward if we believe that, generally speaking, truth is preferable to falsehood, and that evidence can be adduced to show what is correct and what is wrong?

Here’s my suggestion. Before a debate starts, people should be asked – and you should ask yourself – “what piece of evidence would change your mind on this?” This would not work for many, I know: there are still those around who blame the credit crunch on Gordon Brown, and many US Republicans think that allied forces did find WMD in Iraq. And some simply deny truth – see the anti-vaccination morons. But for others, it would be good fun to keep the answers on record for a few years. Did introducing a minimum wage, or restricting fox-hunts, or ending duty-free concessions at airports and ferries, cause mass unemployment, as we were told they would? Did Viagra bankrupt the NHS? We have certainly been around long enough to show that (e.g.) those who said that quantitative easing would cause runaway inflation are wrong. Mind you, those who said it so forcefully now mutter and walk in the opposite direction when challenged – or say “it will happen yet”.

I suppose the best safeguard is to have a personal sense that you’re not right all the time – in the words of Cromwell (not a great self-doubter, I know) – to think it possible that you may be wrong. Maybe jot down the things you’ve changed your mind about over the years, from the idiocy of gay marriage to the integrity of Robert Mugabe. Ask yourself what would make you change your mind on a much-loved prejudice. A personal understanding of the dangers of bias is one of the best weapons against it. Carol Tavris and Elliot Aronson wrote a wonderful book, “Mistakes Were Made (But Not by Me)”, which I thoroughly recommend. They conclude that “drivers cannot avoid having blind spots … but good drivers are aware of them”. Or, to quote Carlyle, “the greatest of faults is to be conscious of none”.