Building a performance platform for GOV.UK
Last summer, Max Gadney and Matt Biddulph briefly joined the GDS team to bring their experience and insight to an early stage product for measuring the performance of digital public services. Max explains how they developed a prototype to demonstrate the potential of visual data.
Richard Sargeant, Director of Performance & Delivery at GDS, asked us to create a prototype for how we might use data to improve public services.
His number one goal was, initially, to show how GOV.UK usage data could represent success or failure in a way that prompted people to take action.
We were asked to address immediate design decisions ahead of version 1.0 but also to look ahead to future iterations. We were to use real departmental data (predominantly from GOV.UK) in Matt’s path-finding and my visual prototyping.
Our methods – Users, Data and Pathfinding
In order to design anything you need to understand its parameters – its users, materials, context and so on.
Understanding what users require is key to developing any software. Our aim here was to avoid making assumptions about what to present, and to ensure our solutions were based on solid evidence and a clear understanding of user needs.
We worked with the data analysts to understand the different roles within the departmental hierarchy and their needs of the data. We constructed a rough ‘organisational mapping’ showing key information routes and customers. At each node we identified key types of people, such as ‘analysts’ or ‘ministerial advisers’, and made personas of them, concentrating as much on what they need as on how they share and act on it.
It became very apparent to us that senior people in the team need clear, concise insights requiring little explanation; only the analytics team or product owners regularly needed the extra layers of detail and functionality.
We needed to understand the data we were working with, including questions like:
- how far back did it go?
- which departments owned what kind of data?
- what time intervals existed across the various data sets?
Crucially, we needed to understand which data sets were required to answer the questions being asked by the main user groups.
We listed all the data we had and created a taxonomy of internal and external reporting needs. This went some way to showing what the common elements were, as well as where we would need new types of data.
The first thing we did with any source of data was to extract a representative but manageable sample of the full dataset. We’d then iterate on this sample, creating designs in Illustrator and translating as faithfully as possible into live code. Once we had something plausible, we’d take it back to the owners of the data as soon as possible for feedback.
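The sampling step can be sketched as follows. This is a minimal illustration in Python, not the actual GDS tooling (which the post does not describe); it assumes a simple random sample with a fixed seed, so that successive design iterations are compared against the same cut of the data.

```python
import random

def representative_sample(records, size, seed=42):
    """Take a fixed-size simple random sample of a dataset.

    A fixed seed keeps the sample stable across design iterations,
    so each prototype is judged against the same data.
    """
    if len(records) <= size:
        return list(records)
    rng = random.Random(seed)
    return rng.sample(records, size)

# e.g. trim a year of daily page-view records to a workable subset
full_dataset = [{"day": d, "views": 1000 + d} for d in range(365)]
sample = representative_sample(full_dataset, 50)
print(len(sample))  # 50
```

Keeping the sample small makes the Illustrator-to-code round trip fast, while the fixed seed means feedback from data owners always refers to the same points.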
We created visual screens, working code and design instructions for v1.0, our ‘minimum viable product’.
Below is an overview of what we produced for version one.
a) User-relevant Hierarchy
We based the main structure of pages on the different levels in the organisation and their differing needs for detail:
- simple headline messages at the top of the page, which can also be used on large screen displays, with more detail as the user scrolls down
- horizontal navigation between pages, or dashboards, allowing users to view different sets of information
This design approach was loosely based on the optician eye-test chart – where everyone can read the letters at the top but the information becomes more complex and requires more time to decipher as you move lower down the page.
b) At a Glance information
Our main goal was to enable action from this product. We wanted the visualisations to shout only when there was something to say, so we decided that most data display should be neutral (grey), going green (good) or red (bad) only when something was particularly noteworthy and stood out from the norm.
We also removed any elements that could be removed without detracting from the usefulness of the graph, such as axis lines and markers.
Matt wrote some code that increased the colour intensity the further a value moved from the baseline. The result is data display that differs from the visual cacophony of many business intelligence tools, where reds and greens compete for attention.
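The idea can be sketched in a few lines. This is a hypothetical reconstruction, not Matt’s actual code; the neutral threshold and the scaling against a maximum expected deviation are illustrative assumptions.

```python
def deviation_colour(value, baseline, max_deviation):
    """Map a value's distance from its baseline to a colour and intensity.

    Values near the baseline stay neutral grey; the further a value sits
    above (green) or below (red) the baseline, the more saturated the
    colour becomes. The 0.2 neutral band is an illustrative threshold.
    """
    deviation = value - baseline
    # intensity in [0, 1], proportional to distance from the baseline
    intensity = min(abs(deviation) / max_deviation, 1.0)
    if intensity < 0.2:  # close to normal: stay neutral
        return ("grey", 0.0)
    hue = "green" if deviation > 0 else "red"
    return (hue, round(intensity, 2))

print(deviation_colour(100, 100, 50))  # ('grey', 0.0)
print(deviation_colour(140, 100, 50))  # ('green', 0.8)
print(deviation_colour(60, 100, 50))   # ('red', 0.8)
```

Because unremarkable values stay grey, colour only appears where it carries information, which is what keeps the display quiet compared with typical dashboards.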
c) Enabling the analytics team – Social Design
Social Design takes the relationships and workflows of users into account. The key ‘insight enablers’ in this team are the analysts and we wanted to add their voice to the product and connect them to the end users through the product.
The ‘why’ and ‘what’ labels do this – echoing questions the users have about this data.
We demonstrated how some of these annotations could be automated by calculating significant deviations from the data set. This helps users quickly spot the most interesting points on the graph and satisfy their curiosity about the data.
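One simple way to calculate such deviations is a z-score test, flagging any point that sits more than a chosen number of standard deviations from the mean. This is a sketch of the general technique, not the prototype’s actual implementation; the threshold and the example data are assumptions.

```python
from statistics import mean, stdev

def annotate_outliers(series, threshold=2.0):
    """Flag points that deviate significantly from the rest of a series.

    Any point more than `threshold` standard deviations from the mean
    gets an annotation marker, which could then carry the analyst's
    'why' and 'what' commentary.
    """
    mu = mean(series)
    sigma = stdev(series)
    annotations = []
    for index, value in enumerate(series):
        z = (value - mu) / sigma
        if abs(z) > threshold:
            annotations.append((index, value, round(z, 2)))
    return annotations

daily_visits = [100, 102, 98, 101, 99, 103, 250, 100, 97]
print(annotate_outliers(daily_visits))  # flags the spike at index 6
```

An automated pass like this finds the candidate points; the analyst’s annotation then explains them, which is where the human voice enters the product.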
The ‘i’ button also contains functionality that prolongs the life of the data and the learning journey of the users – here we have features that allow them to further publish, interrogate and talk about the findings.
This social design approach encourages people to get the best out of their analytics department – and the data insight platform as a whole frees them up to have the conversations they should be having.
d) Modular design
The Data Insight Platform is a collection of dashboards. These dashboard pages need not be crowded, as each user would have just the ones they needed. The pages consist of data modules from a library, each with the most appropriate visualisation for the data.
Working with GDS
The team at GDS are focussed on building a product management culture, using agile delivery techniques, developing code in short sprints, providing demos to the whole team, and regularly releasing new code, all backed up with frequent user testing and data analysis.
We developed our ideas, designs and code within the same environment, which means this is not a prototype that will remain on the shelf.
The end results of our work have already been handed over to the Data Insight development team who are building our designs and ideas into the first release of the product coming later this year.
Initially the product will provide some basic insights into the performance of GOV.UK, but in the coming months the team will be releasing new modules and dashboards as the variety of data sources and transactional services that use the data insight platform increases.