A client recently asked me to find out users' top tasks on an intranet site for usability testing.

It is hard to interview the users, so I advised them to create a survey with open-ended questions such as: "What motivates you to use the intranet site?" or "Can you tell us the most important tasks you do on the intranet site?"

4 Answers
This isn't always easy and you will most likely have to adapt whatever practice you decide on to fit your specific situation. If you're unable to observe people, or you're experimenting to learn about how to capture the right information, surveys can help you gain an initial understanding.

The biggest problem with asking people why they do something is that they will find it hard to answer honestly - particularly in retrospect - and in an effort to be helpful, they will give you the answer they think you are looking for.

As you say, open-ended questions are more useful and can give you potential weak signals about issues you haven't considered. "What was your motivation...?" sounds slightly vague and overwritten. I would ask simply-worded questions, but with a degree of ambiguity, and let them supply the detail: "What are you looking for?", "Why are you using the intranet?" and "What problem are you trying to solve?"

An alternative approach which might yield more material would be to choose several people and ask them to keep a diary for a day or week - depending on how many people you can recruit and how much material you need - and to have them note down whenever they use the intranet, what they were looking for, why and whether it was successful. You can then aggregate this information and design your tests.

Don't expect to get it right the first time, and use your experience to refine your method as you go.

Observing (even a couple of) users or having them walk you through their tasks is really valuable. You seem to be aware of this, but it cannot be stressed enough. You wrote that "interviewing the users is hard", but what does "hard" mean in this context? It still might be worthwhile to push for some access. You will need to talk to real users for the usability tests anyway.

You could also talk to the people managing the intranet in the IT department, enquiring about complaints or requests for help they might have received (if users ask for help finding a particular piece of information, it means that this information is useful to someone and that it's not easily findable). People putting content on the intranet might also have something relevant to say (HR, etc.). What information are they still asked about that they would prefer people to look up online? Besides, they are users too, so they might have some issues of their own to report.

Also, without any sophisticated analytics package, some basic statistics (which page is the most visited, or that kind of thing) or log file analysis might be available or worth collecting, even if only for a few days. However, even if you had very good analytics, this cannot be the only source of data. It's perfectly possible that some part of the site is not visited at all because everybody knows it's not working well and prefers to fulfill that particular task in another way (e.g. phoning or emailing the responsible person directly).
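If no analytics package is available, even raw web server logs can yield a crude "most visited pages" list. As a minimal sketch (assuming Apache/Nginx-style access logs — the exact log format and paths are hypothetical, not from the question):

```python
import re
from collections import Counter

# Matches the request line of an Apache/Nginx-style log entry, e.g.
# 10.0.0.1 - - [16/Dec/2012:09:12:00 +0000] "GET /hr/holidays HTTP/1.1" 200 512
REQUEST_RE = re.compile(r'"(?:GET|POST) (\S+) HTTP/[\d.]+"')

def top_pages(log_lines, n=10):
    """Return the n most requested paths, with query strings stripped."""
    counts = Counter()
    for line in log_lines:
        match = REQUEST_RE.search(line)
        if match:
            path = match.group(1).split("?")[0]  # drop any query string
            counts[path] += 1
    return counts.most_common(n)

# Illustrative log lines (made up for the example):
sample = [
    '10.0.0.1 - - [16/Dec/2012:09:12:00 +0000] "GET /hr/holidays HTTP/1.1" 200 512',
    '10.0.0.2 - - [16/Dec/2012:09:13:00 +0000] "GET /hr/holidays HTTP/1.1" 200 512',
    '10.0.0.3 - - [16/Dec/2012:09:14:00 +0000] "GET /news HTTP/1.1" 200 1024',
]
print(top_pages(sample))  # [('/hr/holidays', 2), ('/news', 1)]
```

As the answer notes, treat such counts as one input among several: a page can be rarely visited precisely because everyone has given up on it.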

If you go for a survey, consider asking focused questions like "What function do you use on a regular basis?" or "When was the last time you turned to the intranet and could not find what you were looking for?" rather than broad why and what-motivates-you questions. When planning the recruitment and sample size, keep in mind that intranets are even more likely than e-commerce websites to have starkly different user groups.

Another technical issue to consider: The number of people you need for your survey will depend on your objectives (identifying a list of tasks/issues requires a smaller number of respondents than prioritizing these tasks or quantifying their importance), but whatever you do with the data, keep in mind that the error margin is much higher for a subgroup than for the whole sample. To take just one example, if 10 people out of 100 tell you that task A is the most important, you can expect that between about 5 and 20% of all users actually think so. But if, out of these 100 respondents, only 20 of them match a particular user profile (say they work in a particular country or a specific department, or are managers, etc.) and 2 of them mention task A, your best guess would be that between 3 and 30% of that subgroup really consider task A to be the most important. Your uncertainty about that particular figure is much higher for the small subgroup than it is for the whole population because you are in effect trying to reach conclusions from a much smaller sample.
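The intervals quoted above can be reproduced with a standard confidence interval for a proportion. A minimal sketch using the Wilson score interval (one common choice; the answer does not specify which method it used):

```python
import math

def wilson_interval(successes, n, z=1.96):
    """95% Wilson score interval for a proportion (z = 1.96)."""
    if n == 0:
        return (0.0, 1.0)
    p = successes / n
    denom = 1 + z**2 / n
    centre = (p + z**2 / (2 * n)) / denom
    half = (z / denom) * math.sqrt(p * (1 - p) / n + z**2 / (4 * n**2))
    return (centre - half, centre + half)

# 10 of 100 respondents name task A as most important:
print(wilson_interval(10, 100))  # roughly (0.055, 0.174)
# ...versus 2 of 20 in a small subgroup — same 10%, much wider interval:
print(wilson_interval(2, 20))    # roughly (0.028, 0.301)
```

Both samples give a point estimate of 10%, but the subgroup's interval is about twice as wide, which is exactly the uncertainty problem described above.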

Finally, why does your client want "the users" to define which tasks are important? Are they possibly trying to respond to some complaints from the workforce (and if so, which ones)? There are all sorts of information that can only be obtained from the end users (as opposed to management or IT), but which tasks are relevant for the business is not necessarily one of them. Surely the client invested time and money in this intranet for some reason. This could give you a cue to the relative importance of different tasks.

Do you have the opportunity to go and sit with these people and observe how they're working with the site? If not, then try installing heat map and mouse tracking solutions. This will give you a good idea of at least the areas of the site that are working. Then put together a cognitive task survey with direct questions and some open-ended questions towards the bottom. Don't make the survey more than, say, 10 questions. Also, you don't need to survey more than, say, 5-10 people. After that the answers will get repetitive and yield no new observations. Compare what the users are telling you with what you're seeing from the mouse tracking and heat maps, and you should be able to come up with a fairly good answer. Analytics such as Google's are good but won't actually show you what the user is doing while on the page. Take all of this data and come up with a picture of how the site is being utilized.

On what is this advice about 5-10 people for a survey based? It seems flatly wrong.
–
Gala Dec 16 '12 at 9:12

Have you done a heuristic or cognitive task evaluation? Within a group of 5-10, if you qualified your subjects correctly and fully understood the target personas you were going after, then yes, you can find out what you need from a group of that size. There are times when you may want to run more people, say in different geographic locations. But overall you don't need more participants than that. If what you're asking is well structured, then you'll get the insight you're looking for. Any more than that just adds to the clutter to sift through. I've done well over 40 of these, btw.
–
Tony Dec 16 '12 at 20:22

I am not sure what a "heuristic or cognitive task evaluation" would be (I know "task analysis", but that's not relevant here, and "cognitive walkthroughs" or "heuristic evaluations", but they don't require access to any real user at all), but have you done any survey? More importantly, how would you know if 5-10 people are actually enough? Is it based on some statistical reasoning? Have you done any large-sample survey and compared the results?
–
Gala Dec 17 '12 at 6:32

The 5-to-10 figure was spread for many years in connection with usability tests. It's been popular because it justifies the whole "guerrilla usability" approach and makes it easier to get a project approved, but it has also been thoroughly criticized and is partly wishful thinking.
–
Gala Dec 17 '12 at 6:37

In fact, there is a rationale and empirical evidence for this in some old Nielsen publications, but in any case the question here was about prioritizing tasks, not identifying a set of issues. No matter what type of data you consider (proportion who name a given task as most important, forced choice, importance ratings…), there is no way 5 people is enough to tell things apart with any kind of reliability except in the most extreme cases.
–
Gala Dec 17 '12 at 6:41