The Flawed, Statistically Silly New Study That Calls the Republican Party More Dishonest

A new study out today proclaims that the Republican Party is much more dishonest than the Democratic Party. To which I say, meh.

The report, from George Mason University’s Center for Media and Public Affairs, captured lots of headlines, and the group did a lot to make its conclusions sizzle, using phrases like “Republicans lie more” and “Republicans are less trustworthy than Democrats” in the press statement announcing the study.

Beneath all the hype, though, this report struck me as one of the silliest statistical analyses I’ve seen in a long time. While I do think that the truthfulness of the G.O.P. has sunk terribly in the era of Tea Party delusions, studies like this one detract from the real job of trying to keep politicians honest.

Here’s what’s wrong with the study. Well, almost everything.

To make its assessment, C.M.P.A. relied on a fact-checking group known as PolitiFact, which examines political statements for their accuracy. The center then selected a period—January 20 through May 22 of this year—and reviewed 100 political claims examined by PolitiFact during that time. That included 46 statements by Democrats and 54 by Republicans. Then it toted up the number of statements deemed by PolitiFact to be false.

Calling this research is a stretch. I think a better term is “counting.”

Let’s start digging into this. What standards does PolitiFact use in choosing its statements to review? In an e-mail to Poynter’s Andrew Beaujon, PolitiFact editor Bill Adair says that the comments are selected based on the group’s news judgment. That’s fine for examining the issues of the day, but it hardly lends itself to statistical analysis. If someone’s subjective opinion determines the data set, the statistics are flawed from the get-go.

Then another level of subjectivity is employed: PolitiFact’s judgment on truthfulness. There has been plenty of criticism—from both the left and the right—of PolitiFact’s judgments. While that might be political sour grapes, it means that the group’s determinations are not objectively accepted fact. So now you have two subjective elements—the choice of statements to review and the determination of their accuracy.

The rest of the C.M.P.A. study is equally flawed. Why review 100 instances? Because the number sounded good? The basis for such a selection is supposed to be what is called “statistical significance,” and there is nothing to indicate that 100 statements—filtered through two levels of PolitiFact’s subjective judgment—constitute anything of any statistical value. How many political statements must be made every day? Thousands? Tens of thousands?
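To see just how little fifty-odd statements per party can tell you, here is a rough back-of-the-envelope sketch in Python. The false-statement counts in it are hypothetical placeholders, since the study’s exact tallies aren’t reproduced here; the point is only how wide the error bars are on samples this small, before you even account for the statements not being randomly chosen in the first place.

```python
# A rough sketch, not an analysis of the actual C.M.P.A. data: the false
# counts below (32 of 54, 23 of 46) are hypothetical placeholders.
import math

def proportion_ci(successes, n, z=1.96):
    """95% normal-approximation confidence interval for a proportion."""
    p = successes / n
    half_width = z * math.sqrt(p * (1 - p) / n)
    return p - half_width, p + half_width

gop_low, gop_high = proportion_ci(32, 54)  # roughly 46% to 72%
dem_low, dem_high = proportion_ci(23, 46)  # roughly 36% to 64%

print(f"GOP false rate: {32/54:.0%}, 95% CI {gop_low:.0%} to {gop_high:.0%}")
print(f"Dem false rate: {23/46:.0%}, 95% CI {dem_low:.0%} to {dem_high:.0%}")
```

With samples of roughly fifty statements per party, the uncertainty on each rate runs to about plus or minus fourteen percentage points, and the two intervals overlap heavily; even on its own terms, a gap of a few statements sits comfortably within the noise.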

And why from January through May? If I were willing to ignore the glaring statistical flaws, I could just as easily use this report to argue that the party out of power is more likely to make untruthful statements in the first months of a new presidential term. How did it compare to 2009? (PolitiFact has been doing these ratings only since 2007, so no other periods of review are available.)

Another thing. According to Adair of PolitiFact, the group has rated more than 7,000 statements. If an assessment is going to be made of one party’s truthfulness, why not examine all of the judgments, rather than a comparative handful? It’s fine if you don’t want to put in the work, but then don’t come out claiming that you have proven anything with an achingly limited review.

So, here’s what the C.M.P.A. study does prove: over a short stretch of time, one fact-checking organization that subjectively selects which statements to review rated a larger share of the G.O.P. statements in a small sample as false.

In other words, meh.

On the other hand, death panels? Benghazi cover-ups? Worse than Watergate? “You didn’t build that”? Yeah, the G.O.P. is dishonest. Now someone needs to do a good study to determine which party lies more often.