Because I’m interested in karma points, I became a member of fivesecondtest.com. I clicked on Start Here and it showed:

Final instructions to the user before the test.

Ah ha! The test tricked me. I clicked on “Show Image” and was shown a screenshot in my browser window. But the screenshot was too big and I had to scroll back and forth to see it. I wasted my 5 seconds and blew the test. And then the screenshot went away and I was asked, “What is the purpose of the website?” and given a text input area.

The prompt was: “Take a look at the website screenshot, and look at the design/colors/text.” The questions were:

What is the purpose of the website?

Do you like the color scheme?

Is it too cluttered?

Is there too much text on the left of the iPhone image?

Is the navigation bar easy to understand?

Yep, I didn’t read the instructions. If I had, I would have known to concentrate and look at the image for 5 seconds. Duh, it’s called fivesecondtest.com! Sometimes, I’m late to the party.

My immediate reaction was, “How can you ask these questions based on viewing the screenshot for 5 seconds?” What questions would you ask after giving the user only 5 seconds to view the site under test? Can you get reliable data from this type of test? How do the social aspect and the repetitive nature of the tests affect the results?

Now I was interested, so I decided to do another.

The next test reiterated the 5-second rule in its prompt: “You have 5 seconds to view the image. Afterwards, you’ll be asked a few short questions.” This test asked:

What does this company do?

What did you think of the layout and design?

What stood out most to you?

If you were looking for a website, would you continue to go further into the site to find out more?

Does the website look professional?

Okay, I’m getting the hang of it now.

I did another one. This time the prompt was, “You are a young woman who regularly purchases beauty products (skin care/cosmetics)”. And the questions:

How would you rate this page on a scale of 0 (really bad) to 10 (really great)?

Was it clear what the product was?

What would you do next?

What did you like most on the page?

What did you like least on the page?

I saw my buddy on Facebook and we chatted about it. He said, “You get a lot of BS answers because people really don’t have incentive to answer thoughtfully.”

He also gave me an idea of how the ‘community option’ service level of fivesecondtest.com works. You get 30 responses for free for signing up. Then, for every test that you take, your own test gets shown to another user. Paid plans start at $20 (100 test responses) a month and go up to $200 (1,000 responses) a month.

To summarize my opinions, I feel that this type of test could be a part of a good user testing plan. I was not impressed with the questions in the three tests that I took. Following the GIGO (garbage in, garbage out) principle, bad questions will yield bad answers. I doubt that these questions and answers will yield any actionable information beyond confirming pre-existing assumptions about the sites. However, I feel that a well-designed question that takes into account the nature of the test could produce actionable results. Questions should be general and open-ended. They should deal with impressions and feelings, rather than specifics.

Finally, taking my opinions into account, here are my top five questions from the 15 I saw:

I have a client with a new website that I’m in the process of finishing. They’re a small coffee shop, and the owner has a real passion for making the perfect cup. He’s the type of guy who regularly makes trips to the farms that grow the beans he roasts. At any time, you can walk into his shop and find 18 freshly roasted coffees available for purchase. As on top of their game as they are with coffee, like most busy business owners whose day-to-day operations consume all of their time, they have a current website that looks like a pre-Web 2.0 relic. It performs like one too. That’s the reason I’m building a new one. It’s also why I’m confident that my new design will raise sales immediately. There’s practically nowhere to go but up.

Be that as it may, I’d rather sell my clients facts than confidence, so in the coming weeks we’re going to be doing some user testing. Essentially, the website is going into its beta release.

The purpose of the user test is to get feedback about the new design, to understand whether any parts of the website confuse users, and to test the site for hidden bugs that might prevent a user from getting the information they need or from making a purchase.

We will be getting people who are already familiar with the brand to test the website. Some of them will come from the shop’s Facebook page; the rest will be users who order coffee regularly from the current website.

At our disposal is a plethora of tools to record and measure how a user interacts with the website, including:

videotaping the user (over the shoulder)

tracking mouse behavior

heatmaps

recording the user’s interaction with the website

interviewing the user

looking at web analytics

It’s also possible to have an eye-tracking study done but cost and time factors rule that out for this test.

So what tools should I use? I’m not interested in trying them all. I just want to use the tools that are appropriate for my needs and that provide me the feedback in the most actionable way possible.

Before I start to look at the tools though, I think it’s wise to list out the questions we want to answer and constraints that we have to work under for this user test:

Questions

Can users find the product they want to buy?

Are users confused by the choices in the main menu?

Can users buy the product easily? (examine the checkout process)

Did users easily find the information they expected in the places they expected (i.e. could they find out about returns, the privacy policy, security of the checkout process etc., when they needed it?)

Do users have enough information to make a buying decision on a coffee they have never purchased before?

Does every part of the website function as it is intended to? (this concerns the mechanics of the website)

Can the users find any bugs in the site?

Constraints

Two week time frame

Users limited to Facebook fans and current users

Want to do it as inexpensively as possible

From these questions and constraints we can make some immediate decisions:

Decisions

We don’t need a service like Ethnio which will find users to test the website.

We will want to do interviews with the users. The choices here are either a personal interview, which can be recorded, or having the user fill out a questionnaire. I would prefer a videotaped interview so we can look at the user’s body language and facial expressions. However, it’s possible that we will need to use a questionnaire instead due to time limitations. In either case, we would also like to be able to follow up with the users if we have additional questions.

We need to record how the user uses the website. This could take the form of videotaping how they use the website in the over-the-shoulder style. Since we can expect to have users comb the site while we are not physically present, capturing their screen data seems to be a better option. We should look at programs that record the user’s session and mouse behavior. It would also be helpful to be able to retrieve that data as visual information, via heatmap or some other similar visual display.

Because we will be testing a limited number of users, it is not important to track analytics. We should have the individual feedback from each user which is more granular data than analytics by itself. There might be a tool out there that combines analytics with user testing tools. If so, it’s worth looking at but realistically it’s not necessary.
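For context on what “recording the user’s session” actually involves: tools in this space generally capture a stream of timestamped input events and replay them later. Here’s a minimal sketch of such an event log in Python — a hypothetical format for illustration, not any vendor’s actual schema:

```python
from dataclasses import dataclass, field

@dataclass(frozen=True)
class Event:
    """One captured input event (hypothetical format, for illustration only)."""
    t_ms: int   # milliseconds since the session started
    kind: str   # "move", "click", or "scroll"
    x: int      # viewport x coordinate
    y: int      # viewport y coordinate

@dataclass
class Session:
    """An ordered log of events for one visitor."""
    user_id: str
    events: list = field(default_factory=list)

    def record(self, t_ms, kind, x, y):
        self.events.append(Event(t_ms, kind, x, y))

    def clicks(self):
        """The subset of events a click heatmap would aggregate."""
        return [e for e in self.events if e.kind == "click"]

# Example: a visitor moves to a button and clicks it.
s = Session("tester-01")
s.record(0, "move", 10, 10)
s.record(450, "move", 120, 80)
s.record(700, "click", 122, 82)
```

Replay is then just stepping through the events in time order, which is essentially what the session-playback videos from these tools show.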

That seems to about cover it. We’ve been able to distill our needs into two essential items:

Needs

An interview/questionnaire

Tools that record the user’s session on the website and provide quality reporting tools

Now that we finally know what we need we can look at what’s available and decide which tools best fit our needs. In the event that multiple tools can do the job, we’ll assess the pros and cons and then make a judgement call.

I’ve done some research already into the tools that are available, and they are all listed in the “Tools” section of links in the sidebar. I’m sure there are more tools than what I’ve listed, but I’m also reasonably sure that we can find the right tools for the job from the websites listed. For the purposes of this article, and for our experiment, we’ll confine our set of possibilities to the following 10 choices:

Selection Set

ClickTale

CrazyEgg

Feng-GUI

Inspectlet

KISSmetrics

Loop 11

Morae

SilverBack

Usabilla

UserFly

We can use as many of them as necessary but since we’re also trying to be economical with our time and our money, ideally we’re looking for a one-stop shop that’s free. It’s unlikely we’ll find that. But hey, a guy can hope.

ClickTale offers a suite of tools that look ideal for our needs. They include:

visitor recordings

mouse move heatmaps

mouse click heatmaps

conversion funnels

attention heatmaps

the ability to watch a user’s activity in realtime

form analytics

campaign tracking

Unfortunately, all of that great tracking doesn’t come for free. They offer three plans, starting at $99/month and going up to $990/month. The fact that there’s a monthly fee indicates that this product is meant for ongoing user testing. While this is a consideration – we plan to do additional user tests in the future – it’s $290/month for the plan that gives us access to everything in the above list. It’s possible that this might be the right solution for us, but it’s likely that it will prove to be prohibitively expensive.

CrazyEgg is primarily a click-mapping tool. I’ve had extensive experience with this tool in the past. It’s great for what it does: track mouse-clicks and display the data in several easy-to-understand maps. It doesn’t record where the user moves their mouse. It doesn’t record a video of the entire session. And it costs money. Plans range from $19/month to $189/month, with the $49/month plan making the most sense since it updates hourly instead of daily like the cheaper $19/month package. For our purposes, CrazyEgg might be too limited in its capabilities for us to use it, especially at $49/month. It’s likely at that price range that we will find other tools that offer more functionality. We shall see.

Feng-GUI, if it works as advertised, is a great idea and could be very helpful to web designers. The idea is that designers can upload an image of a project they want analyzed, be it a web page, a print ad, or another design. Feng-GUI “looks” at the image using an algorithm that mimics the human eye and human attention to generate a range of visual-attention maps. At $25 for 10 images, this is a dirt cheap way to get early feedback on web designs.
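Feng-GUI doesn’t publish the details of its algorithm, but visual-attention models in general score each region of an image by how much it stands out from its surroundings. Purely to build intuition, here’s a toy center-surround contrast measure in Python — my own illustrative sketch, not Feng-GUI’s method:

```python
def toy_saliency(gray):
    """Score each pixel by how much it differs from its 3x3 neighborhood mean.

    `gray` is a list of rows of brightness values in [0, 1]. This is a toy
    center-surround contrast measure, not a real attention model.
    """
    h, w = len(gray), len(gray[0])
    out = [[0.0] * w for _ in range(h)]
    for y in range(h):
        for x in range(w):
            neighborhood = [
                gray[ny][nx]
                for ny in range(max(0, y - 1), min(h, y + 2))
                for nx in range(max(0, x - 1), min(w, x + 2))
            ]
            mean = sum(neighborhood) / len(neighborhood)
            out[y][x] = abs(gray[y][x] - mean)
    return out

# A single bright spot on an otherwise dark page.
page = [[0.0] * 5 for _ in range(5)]
page[2][2] = 1.0
saliency = toy_saliency(page)
```

On this toy input the bright spot gets the highest score, which matches the intuition that high-contrast elements draw the eye first; real attention models weigh in color, edges, faces, and more.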

You could make the argument that we would benefit from submitting an image of the front page, the category page, the product description page, the cart, and the checkout pages to Feng-GUI to see what it spits back out. For $2.50 an image it’s not the kind of thing that needs a lot of thought. Just do it already.

I can’t help but think that this service is much more helpful at the beginning of the design cycle than it is at the user testing phase. But I’m biased. It’s my design. And I can’t bear the idea of the algorithm telling me that I need to redesign significant portions of the website. I want to know that before I code it all up so that users are picking nits instead of tearing the entire site to shreds.

Regardless of my feelings as a designer who is attached to his design, the fact remains that Feng-GUI provides a way to get approximate eye-tracking metrics on a design for a very cheap price. As I said at the beginning of this post, I wanted to leave eye-tracking out of this discussion because of the cost and time factors. But Feng-GUI looks like such a great tool that we’re going to have to give it a try.

At a minimum, we’ll have some interesting feedback that we can compare with the feedback we get from our users. It will be enlightening to see whether it agrees with and can predict the users’ behavior and/or interview responses, or whether it contradicts them. For the price of a cup of coffee, we can’t say no.

Inspectlet is a site that lets users record visitor sessions, display mouse-click heatmaps, and track analytics in real time. In short, it does everything that we need to satisfy point #2 on our two-point list of needs. Its heatmap reporting isn’t as extensive as what ClickTale and CrazyEgg provide, but for our purposes, because our test will be limited to 5-10 people, a heatmap of mouse movements is more of a luxury than a necessity. Unlike CrazyEgg, Inspectlet offers the additional features of recording the user’s session and access to real-time analytics.

Unlike ClickTale, Inspectlet is much more reasonably priced. Plans range from $7.99/month to $89.99/month.

KISSmetrics calls itself “person-based analytics”. I wasn’t able to figure out exactly what that means by looking at their website. I believe it has something to do with presenting analytics differently to different roles. For example, the web dev sees one thing, the sales guy another, and the stockroom a third. But it’s a little unclear.

KISSmetrics first popped up on my radar when WIRED did a story on them about how they track users in a way that they cannot delete. The hype was massively overblown but the company was forced to use a new method which users could disable.

What KISSmetrics really seems to be doing is tracking users all around the web to get a better understanding of just what brought them to your website. In that sense, KISSmetrics looks like a great tool for ongoing analytics and testing. Prices start at $29/month.

Loop 11 almost owns this thing. There’s one big gaping assumption I made when I got all sweet on Inspectlet a few minutes ago: I’m assuming that the user will know what they are supposed to do, or that somebody will prompt them to accomplish a task. Whether or not that’s true, the reality is that the user is going to have to be prompted to take specific actions on the site so we can measure how they behave.

Loop 11 provides a way to do that. It’s a tool that lets you create a user test and generates a link you can send to people to get them to participate. This would be helpful to us because we could put the link out on the company’s Facebook page and get feedback on the site from more than 5-10 users.

They have partnered with OpenHallway to allow their users to record the visitor’s session as well. This gives rich feedback in the form of Loop 11 reports and the individual video sessions.

The reason I said that Loop 11 almost owns the category is that a few features are missing. The biggest is the lack of heatmaps. One could argue that it’d be nice to have Inspectlet’s analytics but, as we’ve already covered, analytics isn’t necessary for what we’re doing. The heatmaps, though, are a necessity. In my experience with CrazyEgg, everybody, even the dumbest guy in the room, understands heatmaps – without instruction. It’s such a powerful tool in that sense that it’s a must-have for us.
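Part of why heatmaps need no instruction is that the underlying computation is dead simple: bucket click coordinates into a grid and count. A minimal sketch — my own toy version, not how CrazyEgg or any other vendor actually implements it:

```python
def click_heatmap(clicks, page_w, page_h, cols=10, rows=10):
    """Count clicks per grid cell; `clicks` is a list of (x, y) page coordinates."""
    grid = [[0] * cols for _ in range(rows)]
    for x, y in clicks:
        col = min(int(x * cols / page_w), cols - 1)  # clamp the right/bottom edge
        row = min(int(y * rows / page_h), rows - 1)
        grid[row][col] += 1
    return grid

# Three clicks on a hypothetical "Buy" button near the top right, one stray click.
clicks = [(910, 40), (915, 45), (905, 50), (100, 700)]
grid = click_heatmap(clicks, page_w=1000, page_h=800)
```

Rendering is then just mapping counts to colors, which is why the hot cell under the button jumps out at anyone, no training required.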

The massive downside is the cost. It’s $350 per test. Is it worth it? That remains to be seen. If we can conduct our own tests, then probably not.

Morae is a user testing solution similar to Loop 11 but it’s sold as a software package, has more features, and is more expensive. The entire package costs $1,495. It seems appropriate for a usability business or for a company that is dedicated to ongoing usability testing. For our purposes, it’s too expensive and it seems limited to testing users on specific machines. It’s not appropriate, unless I’m misunderstanding, for conducting a user test via the Internet.

SilverBack is like Morae in that it is tied to a specific computer. It’s much more reasonably priced at $69.95, and while it doesn’t offer as complete a feature set as Morae, it does seem like a great tool for quick in-house user tests. A cool tool, but it’s not right for what we’re doing here.

Usabilla looks really cool. It’s essentially a less expensive version of Loop 11: a website that lets you create and conduct user tests. Unlike Loop 11, it has great reporting tools, including heatmaps. But while Loop 11 has partnered with OpenHallway to offer video recording of user visits, Usabilla doesn’t offer this feature. However, after looking at OpenHallway’s website, I think it might be possible to feed the link that Usabilla generates into OpenHallway so that OpenHallway can record the user tests. Using OpenHallway will cost $19/month for their smallest package, which will work fine for our needs.

Usabilla’s prices range from free all the way up to $139/month. Depending on how much data my client is comfortable sharing, the free version could work just fine for us since it includes 1 test with 10 participants. The downside is that the results are public. The next tier up, at $49/month, allows 1 test with 50 participants, and the results are private.

Finally, we have UserFly. UserFly is like OpenHallway in that it’s a simple tool that records how a user interacts with your website for later playback. Unlike OpenHallway, which is priced based on hard drive space (their smallest plan allows for 90 minutes of video), UserFly is priced based on the number of captures. They offer 10 captures a month for free. Prices top out at $200/month for 10,000 captures. Who would ever look at 10,000 captures is anybody’s guess.

My gut feeling here is that OpenHallway is a better value. They also seem to play nice with Loop 11, which makes me think they can play nice with Usabilla. While UserFly is definitely worth trying – especially since we can do so for free – it really seems like a good topic for a blog post, but it’s not quite what we need for this user test.

The Final Results

After all of that, it seems clear that Feng-GUI, Inspectlet, and Usabilla are the tools we should use for our user tests. With Feng-GUI we will get some great data that approximates an eye-tracking study. With Usabilla we have a tool where we can actually conduct a user test: we can write the instructions for the user, and they can take the test without the need for us to moderate it. It will provide us with great feedback based on the outcomes of the user tests. We will use Inspectlet to record the user sessions; it will generate additional data and heatmaps that show us more directly how the users behaved on the website.

The cost is reasonable too. For one month of testing it will cost:

$25 for 10 Feng-GUI tests

$7.99 for Inspectlet (the first week is free)

$49 for 100 tests on Usabilla (or 10 for free)

Total: $81.99
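As a sanity check on that arithmetic (prices as quoted above, tracked in integer cents since floats invite rounding surprises with money):

```python
# Monthly line items in integer cents, matching the quoted prices.
line_items = {
    "Feng-GUI, 10 images": 2500,
    "Inspectlet, smallest plan": 799,
    "Usabilla, 1 test / 50 participants": 4900,
}
total_cents = sum(line_items.values())
print(f"${total_cents / 100:.2f}")  # prints $81.99
```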

If we found ourselves on an extremely limited budget, we could manage to conduct 10 user tests for free. We would have to do without the Feng-GUI analysis and we’d have to allow the Usabilla reports to be made public but the upside is, it wouldn’t cost us a dime. However, in this particular case, $82 seems like a completely reasonable price to pay for a month of user testing.