PRESS MAN: The Survey Says...(Whatever You Want It to Say)

It is always illuminating to note the occasions when the famous skepticism of the hard-boiled American journalist (“If your mother says she loves you, check it out!”) up and leaves him. Environmental reporters, to cite a common instance, will take the Wilderness Society at its word when it releases a report predicting ecological calamity at the hands of grasping capitalists; when a think tank backed by an oil company puts its own scientists on the case, the reporter smells a rat. Ideology often accounts for the disparate treatment, of course. If your mother says she loves you—well, it all depends on what kind of gal your mom is.

Just as often, though, some other impulse overcomes the skepticism, a tendency built deep into journalism itself—maybe even into human nature. The suicide of Tyler Clementi on September 22, after his roommate at Rutgers University streamed live video on the Web of his sexual encounter with another (male) student, was a tragedy for his family and friends and a devastating embarrassment for his school. It was also a windfall for Campus Pride, an organization that trolls the nation’s colleges and universities for anti-homosexual activity. When Clementi’s suicide found its way into the news stream, Campus Pride was happy to help.

As luck would have it, earlier in the month, CP had issued a “landmark research study” called “The 2010 State of Higher Education for LGBT People” (LGBT is the unpronounceable acronym for Lesbian, Gay, Bisexual, and Transgendered, though in the text of the study, it is lengthened to LGBTQQ, reflecting the addition of the “Queer and the Questioning” and making it even more difficult to pronounce). The state of higher education for LGBTs was, the report concluded, not good. “The results point to significant harassment of LGBT students and a lack of safety and inclusiveness that exists on campuses across the country.”

At nearly 200 pages, the report had all the trappings of social science: charts, graphs, regression analyses, covariances. The press release boasted that the findings drew on a survey of “nearly 6,000” homosexual faculty and students from campuses in all 50 states (the actual number of participants was 5,100, but trust me: no one was counting). The report was respectfully received in outlets like the Washington Post and the Chronicle of Higher Education, and then, with Clementi’s suicide a few days later, it really took off.

One interview on NPR’s Morning Edition was typical. The chief author and researcher, an associate professor from Penn State named Susan Rankin, ran through the findings: “Contrary to popular belief, campuses have not become significantly safer for gay and lesbian students and faculty.” Indeed, some “data” indicated that the “climate” in higher education may even be getting worse.

The NPR interviewer noted Rankin’s academic affiliation as a way of establishing her bona fides. He neglected to mention the rather more important tidbit, that her report had been commissioned by an advocacy group whose continued existence depended in part on the wide acceptance of its findings; if it’s open season on campus gays, then they’re going to need a trade group. The usefulness of the report, from a journalistic point of view, lay in the conclusion that it encouraged NPR listeners to reach: Clementi’s awful death was something more than an awful death. It was a trend.

The conceptual difficulties journalists had to surmount in pushing this idea were formidable. Begin with the conclusion: America’s college campuses are inhospitable to gays and lesbians? That’s news all right: surprising and alarming—also manifestly untrue, as anyone who has strolled across a sun-dappled quad in the past decade can tell you. And the survey’s methodology was too rickety to support any kind of grand declaration. The survey questions were developed by social scientists in consultation with “members of the LGBTQQ community.” The sample of 5,100 respondents, while impressively large, was not randomly selected, and so could not confer statistical validity. Researchers instead used a technique called “snowballing.” They attended national conferences where LGBT faculty and students were likely to gather and asked them to take the survey. Those who agreed were asked to spread the word to other potential respondents by word of mouth and direct mail, and so on, with the snowball gathering participants as it rolled downhill. The methodology should have made it clear to any reporter who bothered to read the fine print that the report was commissioned and written by activists soliciting other activists to recruit still more activists for a survey designed to advance their activism. The report and the survey it was based on were, in scientific terminology, hooey.
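The flaw in snowballing is easy to demonstrate with a little arithmetic. Here is a minimal simulation—every number in it is invented for illustration, not taken from the report—of what happens when a survey is seeded among conference-goers who report a problem at a high rate, and each recruit tends to resemble the person who recruited him:

```python
import random

random.seed(0)

# Invented numbers for illustration: the true rate of some experience
# in the population is 10%, but the conference-goers who seed the
# survey report it at 40%, and recruits resemble their recruiters.
TRUE_RATE = 0.10
SEED_RATE = 0.40
HOMOPHILY = 0.8   # chance a recruit shares the recruiter's experience

def snowball_sample(n_seeds=50, waves=4, recruits_per_person=3):
    """Simulate snowball sampling: seeds recruit peers, who recruit
    peers of their own, wave after wave."""
    # Seeds are drawn from the high-reporting subgroup.
    sample = [random.random() < SEED_RATE for _ in range(n_seeds)]
    frontier = sample[:]
    for _ in range(waves):
        next_wave = []
        for person in frontier:
            for _ in range(recruits_per_person):
                if random.random() < HOMOPHILY:
                    next_wave.append(person)  # recruit matches recruiter
                else:
                    next_wave.append(random.random() < TRUE_RATE)
        sample.extend(next_wave)
        frontier = next_wave
    return sample

sample = snowball_sample()
estimate = sum(sample) / len(sample)
print(f"Sample size: {len(sample)}")
print(f"True population rate: {TRUE_RATE:.0%}")
print(f"Snowball-sample estimate: {estimate:.0%}")
```

Under these assumptions the snowball collects a few thousand respondents and reports a rate more than twice the true one—a large sample, faithfully measuring nothing but the company its seeds keep.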

There was an undeniable ideological element in the gullibility that led NPR—and the Washington Post and the Chronicle and PBS’s NewsHour and others—to treat a bogus piece of social science as news. But the ideology was merely reinforcing a much deeper impulse: the hunger for data, hard numbers, anything that can be called research, which will yield generalizations that can lift a straightforward news story onto the loftier plane of a cultural observation. By its nature, reporting is a small-bore enterprise, dealing in particulars; but contemporary journalism aches for an aerial view. The ranks of American journalism are filled with big-picture guys. And social science is happy to flatter their weakness.

Sometimes I wonder how our news media would function if journalists suddenly brought a skeptical eye to the reports and surveys and numbers that the nation’s social scientists never stop producing. Entire sections of USA Today would vaporize, NPR would fall silent for hours at a stretch, the morning shows would be reduced to weather and sports, and the newsmagazines would soon be going out of business if they weren’t already. The problem for journalists is that most social science isn’t particularly scientific. This deficiency, to the extent that journalists are even aware of it, makes the data no less tempting or addictive. I’ve lately taken to making frequent stops at a website and magazine called Miller-McCune, which offers up a steady supply of research published by sociologists, psychologists, professors of marketing and business, evolutionary psychologists, econometricians, anthropologists, and their colleagues in disciplines more dubious still. The site has become enormously popular among journalists because the stuff it provides is camera-ready, easy to package in sentences that begin with the stirring words “Research shows…”

As the election approached, Miller-McCune bristled with research about politics. Below the glistening surface, though, the data are almost always murky. Here’s an example from early October, chosen at random. “Experimental research shows,” one story announced, “when [voters] say they intend to do something [like vote], they are more likely to do that.” Thus if a campaign calls voters before an election and doesn’t just implore them to vote, but actively asks whether they plan to do so, voter turnout will increase by as much as 23 percent. A nifty little datum, perfect for a walk-up article before Election Day, and totally certified by science!

Except it isn’t certified by much of anything. The “experimental research” consisted of an exercise conducted by 13 Ohio State undergraduates who polled 60 of their classmates by phone shortly before the 1984 election. The results (and that 23 percent) have been cited ever since, even though later experiments with larger samples have failed to produce the same effect.
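Sixty subjects simply cannot certify an effect that size. A back-of-the-envelope check—the 30/30 split and vote counts below are invented to make the arithmetic concrete, not drawn from the Ohio State exercise—shows how wide the uncertainty around a 23-point gap would be:

```python
import math

# Invented illustration: split 60 students into two groups of 30;
# suppose 26 of those asked about their intentions voted (86.7%),
# versus 19 of those who were not (63.3%) -- a 23-point gap.
n1, n2 = 30, 30
p1, p2 = 26 / 30, 19 / 30

diff = p1 - p2
# Standard error of a difference between two proportions.
se = math.sqrt(p1 * (1 - p1) / n1 + p2 * (1 - p2) / n2)
lo, hi = diff - 1.96 * se, diff + 1.96 * se
print(f"Observed gap: {diff:.1%}")
print(f"95% confidence interval: {lo:.1%} to {hi:.1%}")
```

With these assumed numbers, the interval runs from roughly 2 points to 44 points: the data are consistent with a transformative effect or with almost none at all, which is another way of saying they certify nothing.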

You’d be surprised how many truths of social science collapse in this way, in the attempt to extrude grandiose generalizations from evidence that can’t supply them. It’s a weakness that journalism shares with most social science, and this surely explains why the two trades have so easily become allies in the dissemination of hooey. Sometimes the behavior of 60 undergraduates in Ohio reflects nothing more than the behavior of 60 undergraduates in Ohio. And sometimes, as in the case of poor Clementi, an awful death is just an awful death.