How many listeners donate? One in 12 or one in three?

One ratio looks at who's listening now and the other at the weekly cume. Advocates say each estimate has its uses.

Published in Current, August 10, 1998

Which is it? Is the conventional wisdom correct — that one out of every 10 or 12 public radio listeners is a station member? Or is it the more encouraging one-in-three found by the Audience 98 research project? The seemingly conflicting estimates flew past each other at last month's Public Radio Development/Marketing Conference in Washington, D.C., without much elucidation. Now comes an attempt to supply some.

The leading proponents of the 1:12 ratio, Oregon-based fundraising consultants Lewis-Kennedy Associates, reported at the conference that an average of 8.3 percent of stations’ weekly cume listeners can be found as donors in the membership files.

The challengers, associated with David Giovannoni and the ongoing Audience 98 research project, repeated their recent good news about the 1:3 ratio. Giovannoni reported during the spring [Giovannoni’s article] that about 33 percent of the people listening at any given time identify themselves as members.

One of the worst things about 1:12 is that it gives a false picture of the listening audience, says John Sutton, a Maryland-based fundraising consultant who works with Audience 98. The fact is that "core" (frequent) listeners to a station, who tend to be its members, tune in so much that they boost the number of donors listening at any given time. That frequency effect is captured by Arbitron's average-quarter-hour (AQH) measure but not by the weekly cume, which counts each listener once no matter how often they tune in.
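A back-of-the-envelope sketch shows how the frequency effect alone can reconcile the two ratios. All of the figures below are invented for illustration — they are not Audience 98 or Lewis-Kennedy data:

```python
# Hypothetical station (all numbers assumed): donors are 1 in 12 of the
# weekly cume, but because they listen more hours, they make up a much
# larger share of the audience at any given moment (the AQH view).

cume = 120_000        # distinct listeners in a week
donors = 10_000       # donors found in the membership file -> 1:12 of cume
donor_hours = 10.0    # assumed weekly listening per donor ("core" listeners)
other_hours = 2.0     # assumed weekly listening per non-donor

donor_listening = donors * donor_hours              # 100,000 listener-hours
other_listening = (cume - donors) * other_hours     # 220,000 listener-hours

# Donors' share of all listening = their share of the audience
# at an average moment
aqh_donor_share = donor_listening / (donor_listening + other_listening)

print(f"Donors are 1 in {cume / donors:.0f} of the weekly cume,")
print(f"but about {aqh_donor_share:.0%} of the audience at any given time.")
```

With these assumed listening levels, the same station shows a 1:12 ratio against its cume and roughly a one-in-three donor share of the moment-to-moment audience — both numbers describing the same membership file.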

This helps Sutton know who he’s talking to. “When I open the microphone, I know that one in three listeners are already givers.” Another third are fringe (infrequent) listeners who seldom join, and the final third are the real prospects.

This can guide a station’s pledge strategy. Members are easy to reach with quick blitzes since they listen more, but non-givers need longer, less intensive efforts that won’t turn off the givers who listen, Giovannoni advised in May.

“If it’s really one in 12,” says Sutton, “then our on-air fundraising efforts have been really dreadful at getting new members. The implication is that we have to do more on-air fundraising and direct-mail acquisition. It ends up in creating a misguided effort.”

Stronger programming is the way to convert nongivers to givers, he contends, both by improving service to the ones who listen often and by wooing seldom-listeners into the cume.

Not incidentally, Sutton is advocating shorter, more intensive pledge drives, while Jim Lewis of Lewis-Kennedy is skeptical that they’ll reliably bring in new members.

Both ratios inherit quirks from their data sources. The 1:3 ratio is based on listeners’ own notoriously unreliable account of whether they’re members or not, pointed out Barbara Appleby, v.p. of the Development Exchange, during the conference. But Giovannoni says he adjusted the data from the Audience 98 resurvey of Arbitron diarykeepers to compensate for that bias.

The 1:12 ratio also has problems. It pairs a weekly number (Arbitron cumes) with an annual number (membership records), which Giovannoni says is “like taking the price of milk and dividing it by the price of eggs.” The weekly count also downplays the donors by leaving out many listeners who tune in less than once a week but may be members, Sutton points out.

Lewis doesn’t argue against the 1:3 view, but says the 1:12 ratio is still useful as a benchmark for stations to see how they’re doing compared with other stations. Strong stations do as well as 1:6, while one of the weaker stations in the recent survey came in at 1:20. “The only use we’ve ever claimed for it is: what is your relative performance?”

If you’re looking at a station’s cume, the 1:12 average gives you a benchmark to figure how big its membership would be if the station performed at an average level.
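As a sketch of that arithmetic — the station's cume and membership here are invented for illustration:

```python
# Hypothetical benchmark check: at the 1:12 average, a station's
# expected membership is simply its weekly cume divided by 12.

weekly_cume = 150_000     # assumed weekly cume for an example station
actual_members = 9_000    # assumed membership roll

expected_members = weekly_cume / 12              # 12,500 at an average station
station_ratio = weekly_cume / actual_members     # ~1:16.7, weaker than average

print(f"Expected members at the 1:12 average: {expected_members:,.0f}")
print(f"This station's actual ratio: 1:{station_ratio:.1f}")
```

A station running at 1:16.7 against a 1:12 average would, on Lewis's reading, be underperforming its peers — without the benchmark saying anything about why.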

Lewis and Sutton agree that the benchmark would be improved if it compared the station’s cume of core listeners to its membership rolls, and Lewis plans to try that for an October report on Lewis-Kennedy’s CPB-funded Target Analysis project.

Audience 98’s database, in comparison, can’t help in evaluating membership levels for individual stations, because station-level numbers can’t be drawn from the resurveyed Arbitron sample, Sutton acknowledges.

The range of ratios averaging 1:12 does indeed provide a performance benchmark, Sutton says. “But is it telling you that your membership activities are good, or that your program schedule ought to be strengthened?”