sxswi: stop listening to your customers

Brief

A common assumption among startup entrepreneurs is that listening to potential customers is the best way to find out whether your product or idea will succeed in the market. Honestly — don’t bother. In our ten years of user experience research for startups and big companies alike, one thing we’ve seen time and again is that it’s behavior, not opinions, that tells you whether people want to use your product. The main problem with opinions is self-reporting bias: opinions are often inconsistent with behaviors or other attitudes, especially when discussing hypotheticals. Remember Clippy, the little character that appeared in Microsoft Word years ago? That little bastard arose, in part, from Microsoft asking users if they wanted help working on their documents — everyone said, “Sure, sounds great.” But once people started actually using it in the real world, they hated it — it may be one of the most hated features in the history of computing. And Microsoft employs hundreds of researchers. So where did they go wrong, and how can you avoid making the same mistake? It’s simple. Never ask people what they think of your product or idea. Instead, I’ll walk you through the world of researching people, including what you need to know to ignore customer opinions effectively, just like Apple and 37signals. I’ll go over examples from our research with Volkswagen, Electronic Arts, and Wikipedia, and show how to use remote research to construct behavioral scenarios and eliminate poor research.

“If you were working on a document and you got stuck, and a friendly character popped up to offer you help, would that be good?” Focus group: “Sure!” (Hypothetical questions, a false premise.) Death to focus groups?

Investigate current behavior

Totally different from monitoring system metrics. Watch what people do, and plan changes to the interface around that. Focus: reduce friction. Twitter: this is how the retweet and hashtag search features were improved.

Make sure observation is contextual (watch people when they are hungry). Double surveys: ask open-ended questions, then build your multiple-choice survey from the answers. People lie less at home (their native environments), and there are plenty of remote testing tools. Pick and choose your tools, and don’t use any of them by itself.
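The double-survey idea can be sketched in code. This is a naive illustration, not the speaker’s method: it assumes open-ended answers have been collected as plain strings and simply tallies normalized duplicates to derive multiple-choice options, where a real pass would involve human coding of the responses.

```python
from collections import Counter

def build_choices(open_ended_answers, n_options=4):
    """Derive multiple-choice options from open-ended survey answers.

    Naive sketch: tally normalized answers and keep the most common
    ones as options, with a catch-all "Other" at the end.
    """
    tally = Counter(a.strip().lower() for a in open_ended_answers if a.strip())
    options = [answer for answer, _ in tally.most_common(n_options)]
    return options + ["Other"]

# Illustrative data, not from the talk.
answers = [
    "price", "Price", "too expensive", "price",
    "hard to install", "price", "hard to install",
]
print(build_choices(answers, n_options=2))
# → ['price', 'hard to install', 'Other']
```

The point of the second, multiple-choice survey is that its options come from what respondents actually said, not from premises you invented for them.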

Sprint to testing

No false premises: get to a functional prototype (something you can test) as fast as you can (mocks, paper, index cards, static HTML). Use real data, not sample data. Iterate (test, change, test, change, test, change). Don’t worry about building prototypes that contain every solved problem; focus on the functions you need to learn about or solve.

Case: Sifteo

Emphasize the moment

Focus on what’s happening in the moment. Put functions relevant to the content in that moment next to that content (relationships between content?). Don’t make me leave and come back (side-by-side).

Task analysis in four simple steps

Define the audience and their goals.

Create tasks that address these goals.

Get the right people (in the room to test).

Watch them try to perform the tasks.

You see what’s broken immediately. It ties you to the people you are building for. Run early tests with people you know (safe testing), later tests with people you don’t know.
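The four steps above can be sketched as a tiny session record. All the names and tasks here are illustrative, not from the talk; in practice the “observe” step is a human note-taker watching the participant, stubbed out below:

```python
from dataclasses import dataclass, field

@dataclass
class Task:
    goal: str           # step 1: the audience goal this task addresses
    instruction: str    # step 2: what we ask the participant to do
    observations: list = field(default_factory=list)  # step 4: what actually happened

def run_session(tasks, observe):
    """Watch a participant (step 3: the right person) attempt each task.

    Records behavior, not opinions.
    """
    for task in tasks:
        task.observations.append(observe(task))
    return tasks

# Hypothetical tasks for a recipe product.
tasks = [
    Task(goal="find a recipe fast", instruction="Find a 20-minute dinner recipe"),
    Task(goal="save for later", instruction="Bookmark the recipe you found"),
]
done = run_session(tasks, observe=lambda t: f"attempted: {t.instruction}")
print(done[0].observations)
# → ['attempted: Find a 20-minute dinner recipe']
```

The useful discipline is that every task traces back to a stated goal, and the only data recorded is what the participant did.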

Each team member sees the user’s desktop view of the tool (as it is being used) overlaid with video of the user’s face (Figure 3).

Figure 3

Good Research


Share findings in a way that is welcoming to your designers and developers. Mix tools: GoToMeeting REO, heatmaps (self-reported “why did you click there?”), usertesting.com, Usabilla, Ethnio. Get to functional observation as soon as possible. Stream it live (Ustream).

When do you get to ignore customers completely? When you involve the 7 deadly sins.

Myth! Avoid the myth that geniuses have genius ideas that turn into genius products.
Truth! Great ideas come from other great ideas. Research contributes; imaginative research facilitates invention. (When to use text on the screen: summing up, the take-away ideas, the result.)

Metrics

Metrics don’t tell you what to do; they just tell you whether it works. Observation tells you what to do.
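That distinction can be made concrete with a trivial sketch (numbers are made up for illustration):

```python
def conversion_rate(conversions, visitors):
    """Fraction of visitors who converted; 0.0 when there were no visitors."""
    return conversions / visitors if visitors else 0.0

# A metric can say *whether* a change worked...
before = conversion_rate(120, 4000)  # 3.0% before the redesign
after = conversion_rate(180, 4000)   # 4.5% after the redesign
print(after > before)  # → True
# ...but it cannot say *why*, or what to try next. Only watching
# people use the product tells you where the friction actually is.
```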