Feedback From Survey Respondents: Rejecting Repetition

The success or failure of the insights industry is entirely dependent on the quality of our relationships with the people who complete our surveys. At Maru/Matchbox, we are committed to treating our survey respondents like people—people willing to share their time, opinions and experiences with us.

We recently conducted video interviews with members of our Maru Voice UK community, and asked them what they like and dislike about doing surveys. We also solicited suggestions for improving the survey experience. The results reveal a need to radically change how the industry conducts research and how it treats respondents.

Curiosity and Contributing

Many people complete surveys because they want to make their voice heard. “I like sharing my opinions,” one woman told us. They feel that by sharing their perspective they can make a difference. Asked why he did surveys, one gentleman said, “most of it is because I think I’m making a difference in what we are doing….” Another woman said her reasons for doing surveys were twofold: “One, it’s because if companies, manufacturers and stuff don’t know what the customer or the people know or want or think about their products, how are they going to improve them, or change them, or make them better, or if they need to change them at all? Second reason I do it is because 9 times out of 10 I just find it quite interesting….”

Receiving honorariums is also a
reason to do surveys, but as we have seen from other research
on survey incentives it is a relatively
weak motivator that is more of a symbolic gesture of goodwill and respect than
a payment for time spent. These UK findings are consistent with what we have
seen in the U.S. and Canada.

Disliking Disqualifications

One theme that came up again and again was the dislike of completing long screening surveys only to be told you do not qualify. We know that, in a river sample environment, screening questions are typically repeated over and over before a person qualifies for a survey—if they ever do. It’s an excruciating experience that I lived to write about in Bad Respondent Experiences are Killing the Insights Industry.

At Maru/Matchbox, the people who do our surveys are spared this frustrating and exploitative experience. Our survey community members are richly profiled, and we use that knowledge to target the right people with the right survey. We don’t need to ask things like demographics because we already have that information and can simply import it, an approach we have shown to produce accurate data.

Long Surveys and Frustrating Questions

Other frustrations people expressed included being asked questions that did not apply, and long, overly detailed surveys. As one woman explained, “They ask me, so what kind of products do you use? What shops have you used before? And it goes on from like Tesco, Lidl, Amazon, H&M, Argos, whatever. It’s like endless lists that I’ve been to. I’ve shopped everywhere. Like, who hasn’t? Who hasn’t bought one thing in Argos, and then hasn’t been back for 2 years?” Here the question is both too long and not applicable. How accurate can information about a single store visit from 2 years ago be?

The frustrations with disqualifications, the river sample experience, and long, repetitive, overly detailed surveys that we see in the UK have been echoed across the pond in the U.S. and Canada.

Annie Pettit has written a valuable and very readable book about writing surveys. The title of the book, People Aren’t Robots: A practical guide to the psychology and technique of questionnaire design, speaks volumes. In an email exchange for my book The Insights Revolution: Questioning Everything, I asked her thoughts on the way surveys are written today. She marveled at how we can expect things of respondents that we can’t do ourselves.

“We forget
what we’ve bought, we misremember where we’ve shopped, and we rationalize that
we buy things because we need them, not because we want them. But when it comes
time to communicate with people participating in our research, we instantly
block out our own illogical behaviors and expect them to remember precise SKUs,
stores, dates and times, and so many other meticulous details about every
purchase decision they’ve ever made. That is the definition of illogical.”

“We expect people to not get
tired, bored, forgetful, or distracted when participating in research about
topics that are only interesting to the brand manager,” Pettit continued. “We
expect people answering questionnaires to choose valid answers when none of the
answer options apply to them and provide diverse responses to grid questions
that demand straight-lining. And rather than getting annoyed at our own failure
to plan for and design research that accounts for the illogical creature that
is the human being, we instead call people liars, cheaters, and fraudsters.
Market researchers are supposed to understand and be empathetic towards the
illogical, forgetful, perfectly imperfect human being.”

I think the people we
interviewed would say “Amen” to that.

Reduce Repetition in the Survey Experience

We asked people for suggestions for ways to improve surveys. There was a consensus that people want surveys to be more interesting, and less boring and repetitive. And they have a point. Too often researchers ask the same question from multiple angles, fearful that they might miss something. This is a terrible experience for respondents, who react to the repetition by becoming less engaged, which decreases the accuracy of the data.

What’s more, this repetition is
unhelpful in many ways when it comes to analysis. First, it gives us no new
information. We get basically the same answer to basically the same question,
over and over again—which helps give research the reputation of being boring.
It also focuses researchers on the task of
reporting, at the expense of time spent thinking about what the research means.

Let’s consider this example, from a test of a potential
television show. People saw the show and then were asked a series of questions,
including these:

Do you think this show is excellent, very good, good,
fair, poor?

How much do you want to watch more episodes of this
show?

Would you recommend this show to a friend?

How entertaining do you think this show is?

Would you want to meet the characters on this show?

Do the stories in this show keep you interested in
seeing what happens next?

Do you like what this show is about?

If you were the boss of a TV channel would you put this
show on your channel?

The answer to all these questions is essentially the same because they are measuring one underlying thing: whether people like the show or not. A factor analysis confirmed what a sensible reading of the questions suggests. You don’t need to ask the same question over and over again in slightly different ways. The only question that is needed is the first one. The rest are just noise that alienates the people who answer our surveys. All the other questions are what Annie might call “only interesting to the brand manager.” But the brand manager will be disappointed anyway, because the answers reveal nothing more than whether people like or dislike the show.
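The factor-analytic point can be illustrated with a small simulation. The sketch below is hypothetical (the respondent counts, loadings, and noise levels are invented for illustration, not taken from the show test above): it generates answers to eight redundant questions that all reflect a single latent “liking” attitude, then checks how much of the total variance one factor explains via the eigenvalues of the correlation matrix.

```python
import numpy as np

# Hypothetical sketch: simulate 500 respondents answering 8 show-rating
# questions that are all driven by one latent "liking" attitude plus noise.
rng = np.random.default_rng(0)
n_respondents, n_questions = 500, 8

liking = rng.normal(size=n_respondents)              # one underlying attitude
noise = rng.normal(scale=0.5, size=(n_respondents, n_questions))
answers = liking[:, None] + noise                    # each question = liking + noise

# Factor-analytic check: if one factor drives all the questions,
# the first eigenvalue of the correlation matrix dominates the rest.
corr = np.corrcoef(answers, rowvar=False)
eigenvalues = np.sort(np.linalg.eigvalsh(corr))[::-1]

variance_explained = eigenvalues[0] / n_questions
print(f"Variance explained by the first factor: {variance_explained:.0%}")
```

When the eight questions are really one question in disguise, the first factor accounts for the bulk of the variance and the remaining questions contribute almost no new information, which is exactly the pattern the factor analysis above found.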

Let’s stop playing overly safe by asking the same question
over and over. Let’s craft a few good questions and then trust the answer. Make
this your mantra: when in doubt, cut it out.

We need to heed the voice of the
people who do our surveys. We need to avoid using river sampling and use well-profiled
community members instead. We need to tell survey respondents how much their
opinion is valued and how it will help us make decisions that will serve them
better. And we need to avoid repetitive questions that alienate those respondents
while giving us no additional information.

We must heed the voice of the
people because without them the insights industry is over.
