Thinking Differently with Your Current Toolkit

Recent innovation in survey research tends to focus on applying exciting new technologies to glean insights from previously untapped data sources.

Think non-conscious measurement (NCM) of emotions or mining user-generated social media posts. NCM offers interesting and engaging approaches to add new diagnostics for understanding what respondents cannot articulate; social media offers a breadth of free or inexpensive data for the eager marketer to harvest.

As exciting as these—and many other—emerging methodologies are, researchers do not need to look only to flashy new tools for innovation. Although it’s trite to say, innovation can emerge from anywhere, even in the way traditional online methodologies are applied differently to learn new things. I recently used a “point and click” approach, which is nothing new, to gain unique insight into stimuli within a typical online survey.

The main survey itself evaluated a set of stimuli that featured many components, including photographs, images, product descriptions and prices. We captured the usual suspects among metrics, including future purchase intent and overall liking, as well as other measures specifically related to shopping behavior. However, we also needed granular detail on the concept’s specific elements. What specifically did the shopper like? What didn’t she like? We knew traditional open-ends (where we ask those questions outright) would not yield the distinct focus on specific elements, so we searched for existing approaches that would generate the detailed understanding the client needed. After investigating several techniques, we settled on an approach that had been part of our arsenal for many years—the “point and click,” where we show an image and instruct the respondent to click on the items they liked.

Traditionally, this methodology produces a variety of measures, such as a driver analysis that can be linked to stated measures (e.g., the photograph of the woman using the product is more likely to drive purchase than the product alone). However, we wanted more from the data, and that’s where we got creative: in how we analyzed the data.

First, we used the clicks to create a heat map of sorts on the concept. This provided a visual representation of the areas of the concept that were best liked.

Then, we overlaid percentages on the images so quick comparisons could be made across designs. In addition, we could apply bubbles, sized according to the number of clicks the area received, to provide a clear visual representation (and comparison point) across images.
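To make the mechanics concrete, here is a minimal sketch in Python of how individual clicks might be binned into a grid to yield heat-map counts and the overlaid click percentages. The click coordinates, respondent IDs and grid size are all hypothetical, invented for illustration; they are not data from the study.

```python
# Hypothetical click log: (respondent_id, x, y) pixel coordinates on the concept image.
clicks = [
    (1, 120, 80), (1, 300, 210),
    (2, 115, 75), (2, 310, 205),
    (3, 118, 82),
]

CELL = 100  # bin clicks into 100x100-pixel grid cells (assumed granularity)

def heat_map(clicks, cell=CELL):
    """Count distinct respondents who clicked inside each grid cell."""
    cell_respondents = {}
    for rid, x, y in clicks:
        key = (x // cell, y // cell)
        cell_respondents.setdefault(key, set()).add(rid)
    return {key: len(rids) for key, rids in cell_respondents.items()}

def click_percentages(clicks, cell=CELL):
    """Percentage of respondents clicking each cell -- the numbers overlaid on the image."""
    n = len({rid for rid, _, _ in clicks})
    return {key: 100 * count / n for key, count in heat_map(clicks, cell).items()}
```

The same per-cell counts could also size the comparison bubbles: scale each bubble's area by its cell's respondent count.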

We didn’t limit the time a respondent could spend on any one page, which allowed us to measure “dwell” time, a proxy for how engaging the concept was.
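Computing dwell time from page-timing data is straightforward; a sketch with invented timestamps (not from the study) might look like:

```python
# Hypothetical page-timing log in seconds: respondent -> (entered page, left page).
page_views = {
    "r1": (0.0, 42.5),
    "r2": (0.0, 18.0),
    "r3": (0.0, 61.2),
}

# Dwell time per respondent: how long the concept held their attention.
dwell = {rid: leave - enter for rid, (enter, leave) in page_views.items()}
avg_dwell = sum(dwell.values()) / len(dwell)
```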

We also tracked the order in which the respondent clicked items, which allowed us to determine which items drew interest first, second, third, etc.
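Order-of-click data can be summarized in two simple ways: which element was clicked first, and each element's average position in the click sequence. A sketch with hypothetical click streams (the element names are invented for illustration):

```python
from collections import Counter

# Hypothetical click streams, one list per respondent, in the order clicked.
click_order = {
    "r1": ["photo", "price", "description"],
    "r2": ["photo", "description"],
    "r3": ["price", "photo"],
}

# Which element drew interest first?
first_clicks = Counter(seq[0] for seq in click_order.values())

# Mean position (1 = clicked first) of each element, among respondents who clicked it.
positions = {}
for seq in click_order.values():
    for pos, elem in enumerate(seq, start=1):
        positions.setdefault(elem, []).append(pos)
mean_position = {elem: sum(p) / len(p) for elem, p in positions.items()}
```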

Arguably, the most innovative piece was our ability to compare these click percentages with other captured measures (similar to the driver analysis) and to explore correlations. Adding a few unarticulated variables, like dwell time and order of response, made the analysis more comprehensive and valuable as we delivered a final recommendation on the effectiveness of the stimuli elements.
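One way to explore such correlations is a simple Pearson coefficient between a click-based variable and a stated measure. The sketch below uses invented per-respondent data (whether they clicked a photo element, their dwell time, and stated purchase intent on a 1-5 scale); none of it comes from the study.

```python
# Hypothetical per-respondent measures, invented for illustration.
clicked_photo = [1, 1, 0, 1, 0, 1]
dwell_secs = [40.0, 55.0, 12.0, 48.0, 20.0, 60.0]
intent = [4, 5, 2, 4, 2, 5]  # stated purchase intent, 1-5

def pearson(xs, ys):
    """Pearson correlation coefficient, computed from scratch."""
    n = len(xs)
    mx, my = sum(xs) / n, sum(ys) / n
    cov = sum((x - mx) * (y - my) for x, y in zip(xs, ys))
    sx = sum((x - mx) ** 2 for x in xs) ** 0.5
    sy = sum((y - my) ** 2 for y in ys) ** 0.5
    return cov / (sx * sy)

r_click = pearson(clicked_photo, intent)  # click-on-photo vs. purchase intent
r_dwell = pearson(dwell_secs, intent)     # dwell time vs. purchase intent
```

A strong positive coefficient for an element would suggest that attention to it travels with purchase intent, which is the kind of evidence that feeds a recommendation on which elements to keep.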

Bottom line: Don’t limit yourself to typical uses of traditional methods. Learn to value each tool at your disposal and think creatively about the ways you can approach business questions to learn incrementally. Remember, innovative approaches can emerge from anywhere, even from tried and true sources.

Having dedicated himself to research for almost 20 years, Burke, Inc.’s DeWayne Ray investigates consumer behavior through ever-evolving approaches to help clients remain connected to and engaged with their customers.