While the irony of a man looking to improve educational standards in history being himself unable to spot a compromised source is one which hasn’t escaped the attention of commentators, oddly little spotlight has been shone on the source of the dodgy data itself. Ludicrous as it is that a senior politician in a position of power could be so embarrassingly unfamiliar with acceptable standards of research, it seems remiss not to question why marketing-led data ended up in a position to be taken seriously in the first place.

The Daily Mail, wisely, have yet to criticise Gove – smart, given that they previously published each of the PR surveys he cited, and especially considering that in the last week alone they’ve published more than twenty-two articles derived from the same questionable opinion poll mould. While the error from the Education Secretary was cringeworthy and troubling, there’s a degree of hypocrisy in the news media criticising Gove for believing what they presumably, at the time, felt was worthwhile enough to print.

The ease with which PR surveys pass into the mainstream media – and into the eager hands of Education Secretaries, it seems – would be less concerning if the research they presented was of a high standard. Unfortunately, for the majority of such opinion polls, commercial interest and crippling methodological flaws often render the results worthless.

OnePoll claim to have a community of over 100,000 users, with users paid around ten pence for each survey completed – with surveys regularly consisting of over a dozen questions. As moneyspinners go, it’s hardly lucrative. What’s more, users can only collect their earnings after accruing £40 – equating to roughly four hundred surveys and potentially hundreds of hours.

This low return on investment becomes an incentive to undertake as many polls as possible – and with limits to the number of users able to complete each survey, users are soon tempted to spend no time at all thinking honestly about their answers. If only the first 2,000 people earn 10p for their ten minutes of work, why be the person who lingers too long and finds they’ve wasted their time for no reward? This, combined with the ease of second-guessing the "desired" answer to screening questions (why yes, I DO have children under the age of seven…), almost certainly results in some polls being completed by users with no relationship to the subject matter and no concern for the answers they provide.

The problems don’t stop there – as any psychology student will tell you, any answer you receive can depend entirely on the question you ask and the way you ask it. Take, for instance, a question posed by OnePoll to new mothers:

Do you find you don’t actually care as much about your appearance now that you have had a baby?

A) Yes absolutely

B) No, I care, I just don’t have the time to do anything about it

C) Neither

(OnePoll, 13 May 2013, survey NH SDG 3004 VBM)

Once you pick through the positives and negatives, it’s clear that both of the first two options presume new mothers don’t take care of their appearance. While this may or may not be true, it’s not a finding which the question solicits – yet it would be easy for a PR firm (perhaps acting on behalf of a cosmetics or clothing retailer) to spin an article just as well from either response.

Similarly, multiple-choice questions can often force users into making fairly clumsy sweeping generalisations – such as the question posed to male users earlier this year:

Heaven help the men faced with that choice. Yet, with a shiny 10p on the line, and no other option available, a choice had to be made… and thus, as the Daily Mail declared a fortnight later, “Over half of British men think their mums are better cooks than their partners”. Somewhat lost in the write-up is that the men in question were forced into the choice, and only marginally more than half opted for the first option on the list.

The lack of transparency of survey method and availability of actual data in the final write-up is a further issue with PR polls, and brings us neatly back to those cited by Gove. Take the finding – highlighted as the very headline of both the original press release and the Daily Mail article – that Delia Smith was married to Henry VIII: nowhere is it reported how many of the 2,000 children who took the poll actually gave this as a response. It’s perfectly plausible – if not highly likely – that the main line of the news coverage of this story was generated by only a handful of respondents choosing this rather silly option. It’s equally likely that the survey wasn’t of 2,000 11-16 year olds, but of 2,000 people who said they’d had their children present, some of whom may not have been telling the truth. Those screening questions are, after all, not rocket science to figure out.

Most intriguing of all, then, is the possibility that the Education Secretary criticised the ignorance of children in this country based on newspaper reports of a minority of errant responses given to a PR opinion poll by people who weren’t children at all. Until there’s real transparency on the data, and better control over survey methodology, there’s no real way of knowing. This survey isn’t unique in that respect – there are dozens of similar cases featured in the news each week, often with significant potential flaws.

As for Michael Gove, he ought to be more cautious about believing what he reads in the newspapers, and should learn to be more sceptical of his sources. Those who don’t learn history are doomed to ignorantly repeat things.