Strategies to Improve Your Surveys

With new web-based survey and polling platforms launching every week and out-of-the-box event apps offering survey functions, it’s never been easier to reach out to attendees for feedback. But getting them to answer your questions thoughtfully is another matter. Thoughtful survey design is essential to teasing actionable feedback out of respondents.

“Survey creators have a tendency to throw everything but the kitchen sink into their surveys,” Sheila Grady, marketing programs manager for SurveyMonkey, said. “When you have a captive audience, it’s very tempting to add every possible question you can think of into a survey. And if your survey has multiple stakeholders involved, this only becomes worse.”

PURPOSE

Grady stresses that the overall design of a survey is just as important as the individual questions you ask. Communicating the survey’s purpose to respondents is essential. “If you’re going to take the time to send out a survey, make sure your respondents know that you’re taking their feedback seriously,” Grady said. “For example, here at SurveyMonkey, we run an annual employee-benefits survey to gauge how our employees feel about our existing plans and policies. Our VP of human resources then shares the results with the team and explains how our team’s feedback actually influences company policy.”

QUESTIONS

Another crucial element is the balance of question types. Yes-or-no and open-ended questions should be used sparingly. Short questions designed to capture respondents’ attitudes lead to the most useful, actionable feedback. “If you take a look at any of our templates,” Grady said, “you’ll see they mostly rely on Likert-scale question types [which measure how strongly respondents agree or disagree with statements] to help provide an accurate gauge about the range of opinions in a question, and one or two open-ended questions so respondents can share their feedback.”
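The mix Grady describes can be sketched as a simple data structure: mostly Likert-scale items on a shared agreement scale, plus one or two open-ended questions. This is an illustrative sketch only; the question texts, field names, and the `question_mix` helper are hypothetical, not part of any SurveyMonkey template or API.

```python
# Hypothetical sketch of a balanced survey: mostly Likert items,
# one open-ended question. All names and texts are illustrative.

LIKERT_SCALE = [
    "Strongly disagree", "Disagree", "Neutral", "Agree", "Strongly agree",
]

survey = [
    {"type": "likert", "text": "The sessions were relevant to my work."},
    {"type": "likert", "text": "The venue was easy to reach."},
    {"type": "likert", "text": "I would recommend this event to a colleague."},
    {"type": "open", "text": "What one thing should we change next year?"},
]

def question_mix(questions):
    """Count questions by type so the balance is easy to check."""
    counts = {}
    for q in questions:
        counts[q["type"]] = counts.get(q["type"], 0) + 1
    return counts

print(question_mix(survey))  # {'likert': 3, 'open': 1}
```

A quick check like this makes it obvious when open-ended questions start to crowd out the scaled items that produce the most comparable data.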

LENGTH

Finally, shorter surveys tend to elicit more considered responses. In 2011, SurveyMonkey calculated the median completion time from a pool of roughly 100,000 surveys that were one to 30 questions long. The study found that respondents took just over a minute to read the survey introduction and answer the first question, and spent about five minutes in total on a 10-question survey. They also spent more time per question on shorter surveys, nearly twice as much as on the longest ones in the pool.
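The per-question arithmetic behind those figures is simple to work out. A rough, illustrative calculation from the numbers above:

```python
# Back-of-the-envelope figures from the study described above
# (illustrative arithmetic, not the study's raw data).

total_seconds = 5 * 60   # ~5 minutes for a 10-question survey
num_questions = 10

seconds_per_question = total_seconds / num_questions
print(seconds_per_question)  # 30.0
```

At roughly 30 seconds per question on a 10-question survey, and less than that on longer ones, every added question buys shrinking attention from the respondent.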

According to SurveyMonkey’s study, when respondents begin “satisficing,” or racing through a survey just to finish it, the quality and reliability of your data suffer. Limiting the number of questions forces survey creators to focus on the ones whose answers they can actually act on. “If you’re on the fence about whether you’ll use the results in a meaningful way or not,” Grady said, “just cut it.”