Testing GOV.UK with real users
On Tuesday, Sarah Prag wrote about some changes being made to GOV.UK based on feedback from usability testing. We wanted to share a few more details about what we’ve been doing: the methods we’re using, and the people we’ve been talking to in the run-up to release.
“Putting user needs at the centre of everything we do” is one of our guiding principles at GDS, so it’s our job to make sure that we test GOV.UK with real end users, of all abilities, from across the UK.
Usability testing was carried out on Alphagov, and on the Beta site earlier in the year. Since then, more and more content has been added to the site, which has enabled us, in the last few weeks, to test much more realistic user journeys. Of course we’re finding a few issues… and fixing them – there’s nothing like seeing a member of the public struggle to find something on your website to convince you that it needs to change!
How we’re testing
Our approach has been to use several research methods to give us the big picture – by this we mean in-depth, qualitative lab-based testing; larger-scale quantitative remote usability testing; ‘guerrilla’ testing of small parts of the site with volunteers; and specialist interviews with stakeholders from professional groups and trade organisations. We’ve also been capturing Key Performance Indicators (KPIs) from Directgov and Business Link that will allow us to make a fair comparison with GOV.UK once it’s fully launched.
Quantitative testing gives us a lot of data to work with, so we can see how people move through GOV.UK – Nick will tell you more about that method soon. Smaller, one-to-one methods like lab-based testing give us rich, qualitative information about how people respond to the website as well as how they use it.
How lab-based testing works
In the last 2 months we’ve conducted 4 rounds of lab-based testing, in London, Manchester, Birmingham and Glasgow. In each case we recruited 12 people (a standard number for this type of testing) who represented end users of particular areas of GOV.UK.
For example, we tested business content with small business owners, and detailed international trade content with people working for companies who regularly import and export goods. We tested citizen content with people with young families, older people approaching retirement, people who are working, people who are unemployed and low income workers on state benefits.
In every round of testing we’ve aimed to include at least 2 people with disabilities, but we’ve also tested the website with people with a much wider range of access needs.
This testing has enabled us to identify several key usability issues, which are being fixed before we launch GOV.UK, as Sarah mentioned in her post. All of the testing is cross-referenced, so we can make sure we aren’t just taking a few people’s word for it – often, if we find an issue in the one-to-one interviews, there’s quantitative data that backs it up.
We don’t expect to stop testing once the site is launched and millions of people are using it every month – in fact, we’re expecting to work closely with our colleagues in analytics and data insight.
Once the website data starts telling us what the users are doing, we’re pretty sure our colleagues will want to know why – and that’s where the Insight and Usability Team can help.