Sometimes polling is misleading rather than illuminating - especially if one is too desperate to find evidence for one’s own point of view. Consider this question, which ComRes ran last week: “Agree or Disagree: I would have considered voting Conservative at the next election but will definitely not if the Coalition Government legalises same-sex marriage”. 14% of those who had identified themselves as likely Conservative voters agreed. Some have concluded from this that legalising same-sex marriage will have a decisive negative effect on the outcome of the next general election.

It’s possible, and I can’t say definitively whether it will happen or not. But it seems so unlikely, so far removed from any known voting behaviour, that it staggers me that anyone takes it seriously. Ask people what the most important issues for them at the next election are – the economy, the NHS, schools, tax, immigration, or same-sex marriage – and we all know which would come last. But put the question the way ComRes put it and you give the respondent a ‘free hit’ to vent some irritation. Especially when the warm-up question is the suggestive “Agree or Disagree: David Cameron's plan to legalise gay marriage is more to do with trying to make the Conservative Party look trendy and modern than because of his convictions.”

I mean no disrespect to ComRes: it was not unreasonable of them to run the question, and it is not an illegitimate one. Such questions can occasionally shed a little light. Maybe this one allows us to measure the extent of anger among some Conservatives. But no sane person would grant it any kind of predictive value. Rather obviously, the next election will not be won or lost on the issue of gay marriage.

All pollsters have been accused at one time or another of running bad questions. It’s an occupational hazard: a commercial firm will run questions that it considers legitimate, even if it does not consider them optimal. On complex issues, there is rarely a single correct way to frame a question. My own company YouGov was attacked recently for running two separate questions, for two clients with very different points of view, about regulation of the press. Each was framed to look at a particular aspect of the issue. Each tried to measure public response to a significantly different, but related, solution to the problem. They naturally produced different, non-comparable results. We were accused of contradicting ourselves, but of course we had never made a statement; we had only run what we considered two perfectly fair variations of the argument. There was no one question that could tell you what the public ‘really’ thought – and anyway, the public probably did not ‘really’ think very hard about the nuances of press regulation in the first place.

This is part of my little series of “Myths About Polling”, and I’d like here to dispel two myths at once:

1) There is rarely a single ‘correct’ way to ask a question. Most issues are complex and a poll can only put forward some of the nuances. Every single statement of a position pushes some aspect into the foreground and another into the background. Even questions about simple choices can be hard to frame. For example, the classic ‘voting intention’ question usually begins, “If there was a general election tomorrow…” and yet we all know there is no election tomorrow. That is why the classic voting intention question, asked half a year or more away from an actual election, usually exaggerates the anti-government vote. Maybe it would be more ‘real’ to ask “How do you predict you will end up voting?” But then, who knows what will happen between now and then? Pollsters have agreed that we are trying to measure the current state of feeling, not to predict the future.

2) The second myth is that polling firms are biased. Well, of course individual pollsters probably are biased one way or another, but most polling firms have a variety of professionals working on their research, often of very different views but sharing the same professional attitude. And the most important reason why pollsters try very hard to get it right is that their future livelihood depends on the quality of the work they do today. At the start of the 2010 election campaign, YouGov was the first to show that the outcome was likely to be a hung parliament. Everyone else at the time had a clear Conservative win. We were vigorously attacked by many commentators, who pointed out that my colleague Peter Kellner was a Labour supporter. A few weeks later, when we called the second TV debate for Cameron (by a whisker), we were attacked by Liberal Democrats who said we were obviously biased because my co-founder was running for the Conservatives. The fact is, what really motivates us is to be right. Why would we risk our reputations to provide an ephemeral sliver of momentary advantage for one of our one thousand clients?

So I don’t accuse anyone of biased polling – opinion on same-sex marriage, alternative voting systems, regulation of the press or the future of Europe is highly variegated and subject to all kinds of changing whims and feelings. There is no ‘true’ picture of opinion, only indicators of its dynamics. Polling is an important addition to the debate, but it is rarely definitive.

The comforting part of this is that MPs voting today on this emotive issue really shouldn’t be referring to opinion polls at all, and they certainly shouldn’t bend their views to fanciful predictions of marginal voting effects at some future election. They should simply follow their own judgment.