Observing, reflecting, designing. Making better public services.

September 2009

The best way to a good end user experience is a program of iterative testing and design. There’s been quite a lot of design but not as much iteration as we’d have liked, so now is a good time to really kick off some testing of the D7 interface as it comes into existence so that we can potentially iterate it a little before it is set loose on the unsuspecting public.

In order to take best advantage of the enthusiasm within the community to participate in testing, we should ensure that we are taking a generally consistent approach to the test process, so that we can better compare our results and make decisions more easily. We need a way for anyone who wants to, no matter where they are in the world, to conduct some tests and provide feedback, insight and recommendations in a form that we can actually act upon.

Here are some initial thoughts on a framework for testing the D7 interface that can be adopted by anyone within or outside of the community and around the world.

Making use of the previous usability test framework:

The Drupal Usability group conducted formal usability testing in Baltimore in 2009 (ref: http://drupal.org/usability-test-university-baltimore-community-solutions) which has informed much of the UX work that has been undertaken over recent months. It makes sense to capitalise on this previous effort as far as possible, not only to reduce workload at this point but also because it will provide a useful benchmark. There are some areas where it is appropriate to make some amendments to the approach taken in the Baltimore testing; these are discussed below.

Target Audience/Recruitment:

As with the Baltimore approach, our testing should target two specific user groups who are important to achieving increased ‘reach’ for Drupal, and who have been at the heart of the D7UX design effort. Baltimore identified these groups as ‘Beginner’ and ‘Intermediate’. This assumes that every end user is at some point on the same Drupal learning curve, which our research has shown not to be the case – this is demonstrated in the example users identified in ‘Who Is D7UX for’ (http://www.d7ux.org/who-is-d7ux-for/), where the users Verity and Jeremy are described.

Verity would equate most closely to the ‘Beginner’ audience used in the Baltimore testing; however, the description of ‘Beginners’ specified that these people have some development experience, and the script implies that their role may involve building a website using Drupal. Verity, on the other hand, is the recipient end user of a Drupal site who is then tasked with maintaining the content on that website. This is a very typical end user role for Drupal and one which we hope that Drupal 7 will do a much better job of supporting.

Jeremy, however, is quite a good fit for the Intermediate audience group recruited in Baltimore, although again, development skills should not be required in order for him to successfully complete the tasks in his test script.

I propose that when we are recruiting people to participate in our user research, we compare them to the Verity and Jeremy sketches; if they are similar to either, they are good matches for testing. Depending on which they match, we will then use a discussion guide and set of tasks tailored to their role and objectives.

Discussion Guide/Script:

The scenario presented in the Baltimore testing was essentially to build and/or manage content on ‘a website for a conference where users will post information about the conference like the schedule, speaker bios and local bars’. This is a great scenario and is suitable for both our Verity and our Jeremy participants, so I suggest that we re-use this scenario, the content materials prepared to support it and used in the test, and, as far as possible, the script that was devised for the previous round of testing.

The existing beginner script is, I think, almost entirely suitable for the Verity role, and comprises the following tasks:

publish some information about the conference (to existing content types/sections of the site) taking content from a text file provided.

create a navigation link (if we leave anything out of this script, I think this might be it… not sure if this is a real Verity task… could be tho’)

change the look of your site, remove the logo at the top left

creating a new role (speaker) with additional permissions (I suggest that perhaps rather than creating a new role, assigning users to the speaker role might be more appropriate for Verity)

create a new page (schedule) and navigation to that page

The existing Intermediate script is also very suitable for Jeremy, and comprises the following tasks:

installing Drupal

publishing some content about the conference to the site

creating navigation

creating a new role (speaker) with additional permissions

using taxonomy to categorise ‘workshops’

using a block to highlight content on every page of the site

customising the URL of a particular page

allowing speaker role to use images in their content creation

set up site search

There may be some additional tasks that we want to include; however, these scripts are already quite lengthy, so it is probably best to prioritise tasks now and iterate the script over time, as more testing is completed, to increase task coverage.

Test Materials (What to test on)

I think this is the biggest challenge, especially in the case of Verity, and it’s not something for which I have an immediate solution.

Here’s the problem – it is not reasonable to expect Verity to be able to perform the tasks outlined above using a straight Garland Drupal website, and such a setup would not be a reasonable approximation of this end user’s reality.

Garland looks far too much like Drupal and far too little like a contemporary, designed website. It introduces a raft of challenges to understanding the interface that Verity would never have to address when using the Seven admin theme over the top of a professionally designed website.

We need a theme for testing, however simple, that looks like a proper, designed website. And, in the case of Verity, we need a little structure and content already in place on the website.
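As a sketch of how little is actually needed, a minimal Drupal 7 theme can be defined with just an .info file plus a stylesheet (the theme name and file names below are hypothetical, purely for illustration):

```ini
; camptest.info — a hypothetical minimal theme definition for a D7 test site.
; Placed in sites/all/themes/camptest/ alongside a style.css of our own design.
name = Camp Test Theme
description = A simple, designed-looking theme for D7 usability testing.
core = 7.x
engine = phptemplate

; The stylesheet carrying the custom, non-Garland look.
stylesheets[all][] = style.css

; Standard regions so blocks and navigation behave as participants expect.
regions[header] = Header
regions[content] = Content
regions[sidebar_first] = First sidebar
regions[footer] = Footer
```

The real effort, of course, is in the visual design and the CSS, not the theme scaffolding, so this doesn’t remove the need for a designer’s help.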

Now, there are dozens and dozens of Drupalcamp websites out there… is there some way we can get one of those approximated in Drupal 7 in a reasonably short timeframe?

Ideally, Jeremy would also have a theme other than Garland as his default Drupal 7 theme but, for me, this is less problematic than it is for Verity.

Help!

Reporting Findings & Making Decisions

Our testing needs to be video recorded (screencast) – I’ll be using Silverback (because it’s cheap and I use a Mac) but there are many other solutions out there. In order for our findings and recommendations to be agreed and accepted it is important that we’re all able to check back to the source.

There is a Drupal UX Group on Vimeo (http://vimeo.com/groups/drupalux), which seems like the logical place to post these videos (to avoid the 10 minute rule on YouTube, if nothing else). It also makes sense to utilise the existing drupalusability.org infrastructure for reporting findings, although it will need to be reworked to allow for the ‘crowdsourced’ and iterative nature of this test approach.

Next Steps

Assuming we agree that this is a good way forward, then there are three key tasks that have to be completed:

a) we need to finalise scripts and make them available to people who wish to conduct testing
b) we need to resolve the issue re: a testing theme, and
c) we need to ensure that drupalusability.org is ready to receive feedback from multiple sources

I know that within the UK there are at least two groups who are keen to make a start on doing some testing before the end of September. It would be great to have this framework in place so that we can make the best possible use of their time and expertise, so, time (as ever) is of the essence.