The design principles behind the latest improvements to the ForeSee survey-taking experience


Over the past year at ForeSee, we have been rapidly improving our product. Our constant updates to ForeSee CX Suite show how busy we have been on the reporting side of our business. Behind the scenes, however, we have also been reimagining our data collection, namely one of the most visible portions of our solution: the design of our surveys.

Our research and design teams know that any modification to survey design can have a big impact. Even the smallest change can have unforeseen consequences, so we did not take these decisions lightly.

With that in mind, we set out to evolve our existing survey design, while maintaining the scientific approach to surveying that our customers expect. This involved ample research, design, user testing, and validation with our clients and their customers.

In the coming months, you’ll begin to see a new survey design format, deployed by the first set of ForeSee clients who are excited to be helping us further test these improvements. Here’s what’s coming up with the ForeSee survey design:

Shortening surveys without reducing questions

Our clients love our standard survey questionnaires because they’re backed by decades of academic research and based on a causal model that links customer experience improvements to business outcomes.

Their customers, however, don’t have that context. We often receive feedback that our surveys are too long. While our standard questionnaire runs anywhere from 9 to 15 questions, custom questions added by clients can quickly inflate that number.

We started simple by removing unnecessary drop-downs and improving form inputs across the survey. To add a sense of progression to the experience, we added a progress bar and rewarding micro-interactions. Automatically sending the user to the next question while sliding their screen right to left makes the experience feel more rewarding while maintaining a natural progression forward. Finally, we removed question numbers entirely.
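The auto-advance behavior described above can be reduced to a small piece of state logic: track the current question, clamp advancement at the end, and derive the progress-bar fill from the index. The sketch below is illustrative only; the names (`SurveyState`, `advance`, `progress`) are assumptions, not ForeSee’s actual code.

```typescript
// Minimal sketch of auto-advance survey progression (illustrative names).
interface SurveyState {
  current: number; // zero-based index of the visible question
  total: number;   // total number of questions
}

// Fraction of the survey completed; drives the progress-bar width.
function progress(state: SurveyState): number {
  return state.current / state.total;
}

// Move to the next question once the current one is answered, clamping
// at the end. In the UI, this transition is where the right-to-left
// slide animation would fire.
function advance(state: SurveyState): SurveyState {
  const next = Math.min(state.current + 1, state.total);
  return { current: next, total: state.total };
}

let state: SurveyState = { current: 0, total: 10 };
state = advance(state); // user answered question 1; progress is now 0.1
```

Keeping progression as pure state, separate from the animation, makes the slide transition easy to test and to disable for accessibility settings such as reduced motion.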

Most importantly, we took certain questions from our standard questionnaire and placed them in a matrix format. By adding framing text specific to each element and reducing the length of individual element questions, we made the element questions much easier to answer together rather than individually.

All of these changes led to users perceiving the survey to be much shorter than it actually was. We managed to shorten our surveys without having to reduce the number of questions needed for accurate and reliable insights.

Multi-page survey design & matrix format.

We tested how survey-takers interact with a single-page survey vs. a multi-page survey design.

In our initial tests, completion rates look promising: just as many or more survey-takers are completing surveys, while taking more time to answer questions thoughtfully and submitting more open-ended responses.

Seeking to improve our data collection is great, but it is equally important that every customer has an opportunity to voice their opinion, regardless of ability. With that in mind, we designed the new multi-page survey to comply with Section 508 and WCAG accessibility standards.

Our new mobile survey design.

Surveys as an extension of client branding

To our clients, ForeSee branding is important and synonymous with certainty. But to our clients’ customers, ForeSee branding is commonly described as “that survey that pops up on every website.” (Hey, we’ll take the wide usage of our digital survey as a compliment!)

We wanted to give our clients the option of a survey that serves as an extension of their brand and acts as the beginning of an open dialogue with their customers. Our branding is still there, but it is much less prominent than before.

Now clients can promote their branding across the entire survey experience, including the invitation, survey, thank-you page, and email. Besides their logos, they can add custom images and apply their brand colors across the survey. Ensuring the whole experience can be customized to fit our clients’ brands ultimately leads to a better experience for the end user.

Improving the post-submission experience and closing the loop

During testing, we found that most users fill out a survey for one of two reasons: incentives, or a polarizing (very positive or very negative) experience. We wanted to ensure that those who had a negative experience felt more satisfied after sharing their feedback.

Our new Thank You pages.

Daniel Kahneman and Barbara L. Fredrickson popularized the peak-end rule in a 1993 paper, “When More Pain Is Preferred to Less: Adding a Better End.” It is a psychological heuristic that explains how people recall experiences: people judge an experience, regardless of its length, by its peak (the best or worst moment) and by its end, rather than by the average of all its moments. This effect is observed in both pleasant and unpleasant experiences. Wanting to capitalize on the peak-end rule, we set out to make sure every completed survey ended on a satisfying note.

For example, we can help companies turn negative experiences into more positive ones with context-driven “Thank You’s.” The ForeSee Case Management application can be used to open tickets and alert companies when a direct response to a survey-taker is needed. If a negative experience calls for extra care, such as a coupon code, we make that readily available in customizable, branded templates. If a case needs to be escalated based on a survey-taker’s feedback, we can do that too.

These capabilities allow our clients to close the loop with customers, and possibly reverse a negative experience, improve satisfaction, reduce churn, and keep customers coming back.

Based on a foundation of survey testing and iteration

Our clients value working with us for the rigor and reliability of our survey questionnaires, and a proven methodology that’s peer-reviewed and rooted in academia. Thus, with any modifications to the survey, we take the utmost care in design, research, and continuous testing and monitoring of our changes.

We carefully track the following metrics to ensure that any modifications to our survey design have a positive or neutral impact:

Accepted, presented, and completion rates

Average number of questions before abandonment

Completion time

Open-end response rates

Score variability between the existing and new surveys
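As a rough illustration of how metrics like these can be computed from session data, here is a small sketch. The record shape (`SurveySession`) and field names are assumptions for the example, not ForeSee’s actual schema.

```typescript
// Hypothetical per-session records for computing survey-health metrics.
interface SurveySession {
  presented: boolean;        // invitation was shown
  accepted: boolean;         // invitation was accepted
  completed: boolean;        // survey was submitted
  questionsAnswered: number; // questions answered before abandoning
  openEndAnswered: boolean;  // at least one open-ended response given
  durationSec: number;       // time spent in the survey, in seconds
}

// Share of accepted sessions that reached submission.
function completionRate(sessions: SurveySession[]): number {
  const accepted = sessions.filter(s => s.accepted);
  if (accepted.length === 0) return 0;
  return accepted.filter(s => s.completed).length / accepted.length;
}

// Average questions answered by survey-takers who abandoned mid-survey.
function avgQuestionsBeforeAbandon(sessions: SurveySession[]): number {
  const abandoned = sessions.filter(s => s.accepted && !s.completed);
  if (abandoned.length === 0) return 0;
  const total = abandoned.reduce((sum, s) => sum + s.questionsAnswered, 0);
  return total / abandoned.length;
}

const sessions: SurveySession[] = [
  { presented: true, accepted: true,  completed: true,  questionsAnswered: 10, openEndAnswered: true,  durationSec: 120 },
  { presented: true, accepted: true,  completed: false, questionsAnswered: 4,  openEndAnswered: false, durationSec: 40 },
  { presented: true, accepted: false, completed: false, questionsAnswered: 0,  openEndAnswered: false, durationSec: 0 },
];
// completionRate(sessions) → 0.5; avgQuestionsBeforeAbandon(sessions) → 4
```

Comparing these numbers between the existing and new survey designs is what makes an A/B comparison like the single-page vs. multi-page test measurable.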

We’re not done improving the survey experience for clients and customers

This is just the current iteration in our process to build tomorrow’s survey tool. Stay tuned for follow ups in this series on our ongoing quest to deliver a better survey-taking experience for clients and customers.

Make sure to visit the ForeSee Developer Portal, where you’ll find everything you need to set up and optimize your ForeSee product, including developer documentation, implementation guides, and question-and-answer forums.