Dubious statistics of the day, cybercrime edition

I feel for Peter Maass and Megha Rajagopalan of ProPublica, who have spent 3,700 words and some enormous amount of time trying to track down the source of dubious cybercrime statistics. I went through something similar in 2005, looking at counterfeiting statistics: I can attest to how frustrating and thankless it is trying to follow footnote after footnote in a futile attempt to find something substantive amidst the exaggerated rhetoric.

The short story here: the US government loves to say that the cost of cybercrime in the US is $250 billion per year, while the cost of cybercrime globally is $1 trillion per year. The government loves to say that because it’s in the business of fighting cybercrime, and it loves to feel important. But in reality, those figures are more or less picked out of thin air, and have very little in the way of solid scientific basis. What’s more, they’re all sourced from for-profit companies with a lot of skin in the game: Symantec and McAfee, manufacturers of anti-cybercrime software.

The most interesting thing, to me, about the ProPublica report is that the $1 trillion number ostensibly comes from a scientific survey, but in fact comes from a press release which accompanied that survey. The survey itself never said anything of the sort. In that, it’s just like the $5 trillion which hedge funds are supposed to be managing in five years’ time. (Thanks, Citigroup, for inventing that figure and placing it in the WSJ and elsewhere.)

Is this something they teach at PR school? Commission a scholarly report, and then distribute it with a press release featuring eye-popping assertions to be found nowhere in the report? I suspect that it happens much more than most journalists would like to admit.

What’s worse, once public institutions have officially cited these bogus stats, they feel it would be shameful ever to distance themselves from them — hence the unedifying responses in the ProPublica piece from US government spokespeople, which basically amount to “hey, it’s not our job to check facts; if McAfee puts something in a press release, that’s good enough for us”.

The reason that PR types do this, of course, is that it works. They know that most journalists are much more comfortable working off a press release than putting the work into reading and understanding a long report; they also know that even if most journalists do read the report and steer clear of the story, that doesn’t matter so long as some journalists wind up falling for the bogus numbers. And most importantly, they know that they’ll never get punished in any way for putting out false or misleading information: while journalists are expected to check facts, PR people are shameless.

Which means that the conclusion to my 2005 piece is as true today as it was back then: if you ever see seemingly authoritative statistics being bandied around by journalists or politicians, always bear in mind that there’s a good chance they’re utter bullshit. Especially if they’re particularly striking, or don’t pass the smell test.