Create Surveys That Generate High-Quality Results

Associations Now
November/December 2015
By: Kristin Clarke, CAE

No question about it—associations are conducting more surveys than ever as they hunger to know everything about their stakeholders. New technology makes reaching them easier, faster, and sometimes cheaper. But are surveys generating the high-quality data needed to make good decisions? Check off these trends to boost your survey savvy.

If only associations could read members' minds. Lacking that skill, they're forced to ask questions—and a lot of them. Do members want a monthly magazine or a digital digest? A big-name conference keynoter or a town hall with peers? And then there's the mother lode: Would they recommend this association/meeting/product to a colleague?

Unfortunately, responses are not required. Indeed, members will likely ignore more survey questions than they'll answer.

But that doesn't mean they don't care, according to Douglas Meyer, a marketing and communications specialist with Bernuth & Williamson/The Opinion Collective. "We tend to believe people care as much about our issues and operations as we do, and we take it personally when they don't respond," Meyer says. "However, they may not be responding because we're not catching them at the right time, or we're not asking them things that are relevant to them in that moment. But people do want the organizations they belong to to succeed."

Also, low response doesn't necessarily mean low value. You can still end up with a large data set or a good-quality representative sample, say survey experts.
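The claim that a small response can still yield a usable sample can be checked with a quick margin-of-error calculation. The sketch below is illustrative only, assuming a simple random sample and a 95 percent confidence level; the function name and the membership figures in the usage line are hypothetical, not drawn from the associations in this article.

```python
import math

def margin_of_error(n, population=None, p=0.5, z=1.96):
    """Approximate 95% margin of error for a proportion estimated
    from a simple random sample of size n.

    p=0.5 is the most conservative (widest) assumption; z=1.96 is
    the critical value for a 95% confidence level.
    """
    moe = z * math.sqrt(p * (1 - p) / n)
    if population:
        # Finite population correction: sampling a large share of a
        # small membership list narrows the interval.
        moe *= math.sqrt((population - n) / (population - 1))
    return moe

# A 5 percent response from a hypothetical 10,000-member list still
# yields n = 500 completed surveys:
print(f"{margin_of_error(500, population=10_000):.1%}")  # about +/-4.3%
```

In other words, even a response rate that looks disappointing can produce estimates precise enough for planning decisions, provided the respondents who did answer resemble the broader membership.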

When Kent Agramonte, marketing and research manager at Naylor Association Solutions, conducted a readership and advertising survey with four partnering associations, responses for the Kentucky Association of School Administrators (KASA) and its Florida equivalent reached the desired 10 percent response threshold. But the survey garnered only 5 percent response rates for the Pennsylvania and Maryland/Washington, DC, education associations.

"I don't worry too much about that," Agramonte says. "We wanted to see how much members were spending on products and services. Even with a 5 percent response, I can still say that members told us they spend at least $100 million a year. That's a fact, even though it's not a picture of the whole organization. And we asked what they purchased last year, and that's helpful, too."

The responses also revealed two unexpected audiences reading the associations' publications—school counselors and curriculum managers—which presented new marketing and education avenues.

Wanda Darland, communication and marketing coordinator for KASA, wasn't worried about the response rate, either. She credits the survey with helping staff plan publication themes for the upcoming year, and it confirmed the association had correctly anticipated topics that members wanted for new education sessions.

But low response rates obviously aren't ideal. According to Meyer, "surveys have become a victim of their own success. It's now so easy for anyone to create a survey and to ask question after question that people simply become inundated with survey requests, and they're often really long."

Four Trends to Tally

To combat survey fatigue, associations need to recalibrate their strategies to account for new trends in the field. Any of these four may rev up response rates.

1. Start backward. Organizations often mistakenly begin by talking tactics (number of questions, for example, or type of distribution vehicle). Instead, Meyer encourages them to stop and ask themselves what they really want to know. That often leads to fewer but more targeted "higher-quality questions that give you more actionable insights from the right people," he says.

2. Keep mobile in mind. Pew Research Center reports that 64 percent of American adults own smartphones, so, unsurprisingly, mobile surveys are "the big, big thing," says SurveyMonkey Survey Research Scientist Sarah Cho. More than one-third of all surveys conducted on her company's popular platform, and 35 to 40 percent of those conducted for Meyer's firm, are completed on mobile devices. Cho expects those percentages to climb to 50 percent within two years.

Such rapid growth requires new thinking. First, a survey that looks great on a desktop can look radically different on a tiny smartphone screen, so careful design and testing are essential, Cho warns.

Second, connection speeds vary widely. WiFi provides a faster connection than a cellular network, so a 10-question survey could take minutes on one and "forever" on the other. To encourage smartphone participation, Cho says a survey should include no more than 20 questions, with multiple questions displaying on a page. The latter allows for quicker completion, since mobile participants won't have to reload pages after each answer.

Third, generational and socioeconomic differences between mobile and nonmobile respondents could bias survey results, so organizations should take extra care in analyzing the data.

3. Consider mixed-mode surveying. Researchers are testing new combinations of survey approaches to increase response rates, according to Mollyann Brodie, Ph.D., president of the American Association for Public Opinion Research.

"You might want to get some people to respond on a mobile device, while others respond on the web, and some answer on the phone," she says. Brodie adds that while many organizations use different tactics to drive participants to an online survey, far fewer execute different tactics for the same survey to leverage different response preferences.

The National Association of College Auxiliary Services (NACAS) is one group that does. Its increased surveying over the past two years led Chief Marketing and Information Officer Caleb Welty to add a focus group series directed at different membership segments. His goal was specific: "Get more personal, hands-on feedback on our media and digital products."

Five focus groups, conducted at NACAS's national and regional conferences, are being moderated by Naylor Association Solutions staff to reduce bias. They are intentionally distributed to reveal market changes and to track the effects of a marketing plan that rolled out for the organization's new video-based professional development project, NACAS TV.

4. Realize survey timing is critical. "Seriously consider when your audience will be thinking about you and your issues," says Meyer. "If you're asking about member benefits, you might survey people when they're renewing or at your conference."

At the 2014 League of American Bicyclists' Annual Summit, for instance, Meyer's team surveyed attendees on advocacy messaging near registration by handing them iPads. Only one person declined.

Polling Payoffs

Despite challenges, surveys can offer a glimpse into stakeholders' minds, achieving everything from testing interest in a new product to identifying needed workplace skills.

The Society for Human Resource Management, for instance, wanted to know how much time members spent on compliance in targeted geographic areas where state regulations are particularly complex. By partnering with Marketing General Incorporated (MGI) for an online survey, the organization learned that members spend disproportionate time on federal- and state-compliance tasks that could be handled faster and more effectively if they used SHRM resources. With this finding in hand, SHRM began testing revised messaging around a new membership acquisition and renewals package and found that it beat a control package.

In another example, the Council for Advancement and Support of Education partnered with MGI to test new membership models it feared members wouldn't like. A multiple-approach survey found the opposite, with members welcoming changes once the rationale and driving forces were explained. Relieved, the association accelerated the model launch date.

Whatever the survey trend, experts emphasize that one requirement hasn't changed. "We need to make sure the respondent has a good experience," Cho says. "Otherwise, without the respondent, you don't have a survey. That's the trickiest thing in survey research today."

Trust and Transparency

Associations not only conduct their own surveys, but they also often use surveys by other organizations and polling firms to inform their decisions, especially related to advocacy issues. The question is, can you trust them?

The American Association for Public Opinion Research (AAPOR) launched a program in October 2014 called the Transparency Initiative (TI) to help people make good judgments about the quality of data coming from those polls.

"We have 50 organizations who are members of TI, and on our website you can see elements they've agreed to disclose, such as how they chose respondents, who conducted and paid for the poll, and how it was conducted," says Mollyann Brodie, Ph.D., president of AAPOR. Nonprofit charter members include AARP, the Urban Institute, and various foundations and universities.

AAPOR also offers a free online polling course through News U at the Poynter Institute to help the public understand polls and ways to interpret them. Next on the schedule: refreshing the course as the 2016 election cycle heats up.

[This article was originally published in the Associations Now print edition, titled "Surveys That Score."]
