Our user testing approach

Ellie Harries, Associate at NPC, talks about the importance of user testing websites and our approach to user testing impactsupport.org.

It’s been two and a half months since the launch of the brand new impactsupport.org website, and we’ve been itching to do user testing to make sure it hits the mark. The content, look and feel of the website have all been co-designed with over 100 charities and social enterprises across the country - and now that it’s launched, we’re testing it all over again.

User testing is an integral element of website development, and having worked on quite a few website builds over the years, I can firmly say it is an absolutely essential step. Real users see a website and say it as it is, whereas the people tasked with building and managing digital products often become so close to them that they lose their objectivity. This blog explains how we’ve approached user testing the website so far and includes some initial findings.

User testing

User testing is like any other piece of research: the first step is to clearly define the research questions. In our case, we wanted to know:

What are users’ first impressions of the website?

Can users find the resources they want to improve their impact management?

Does the navigation on the website make sense?

Is the data diagnostic easy to answer and are the outputs useful?

What do users like?

What would they change?

Not a short list!

Once our research questions were confirmed, we developed a script to structure the tests we wanted to run with individual users, along with a range of activities to run as group exercises. These took quite a bit of work to put together and were all based on our research questions.

The next step in user testing is to identify the target audience and recruit testers with the “right” demographics who can usefully critique a website or product.

The audience for impactsupport.org is people who work for charities and social enterprises and actively want to improve their organisation’s impact management. Luckily for us, the Impact Management Programme was set up with users at its core, and we run regular peer learning events in six regions of England. Event attendees are a ready-made, engaged and enthusiastic audience, so the events are a perfect opportunity to get the website in front of real people who are interested in the topic and would benefit from the content.

The good and the bad

The first user testing session took place in Liverpool a few weeks ago and proved very insightful. Some of the feedback has been fab. We heard that:

‘The website is friendly, inviting and structured’

‘There is lots of useful resources and information’

‘It is well-written and provides the guidance and understanding on the subject which people need.’

We got some constructive feedback too:

‘The website maybe a little simplistic but still very helpful!!’

‘The writing is too small’

We’re currently collating and analysing the feedback, and we’ll be doing more user testing at other peer network events. But even the small amount of testing we’ve run so far will allow us to learn, iterate and tweak - and ultimately make the impactsupport.org website better. Like any charity or service provider, we are accountable to our users and need to adapt according to what works and what people want. This is impact management in practice.

For anyone interested in further resources on digital user testing, check out: