
https://gds.blog.gov.uk/2014/01/15/going-live-with-the-service-standard/

Going ‘live’ with the Service Standard


The Digital by Default Service Standard has been running as a beta for almost a year now, but formally comes into force in April. Now that we’ve run over 40 assessments, we thought it was time to summarise what we’ve learned so far.

The Service Standard is what we believe ‘good’ looks like for government digital services. The standard consists of 26 criteria that cover a range of areas essential for building new or redesigning existing digital services. The purpose is to improve the quality of government digital services and support the Cabinet Office spending controls.

In the Government Digital Strategy, government committed that from April 2014 all new or redesigned digital services must meet the Service Standard before they can go live. From that date, services going through an assessment will be assessed against all 26 points of the standard.

Getting ready ...

In preparation for April, we’ve been assessing services against the standard at the end of their alpha and beta phases, before they launch in public or move to the next phase. In the year before the standard formally comes into effect, rather than expecting services to meet all 26 criteria, we’ve primarily focused on whether they meet user needs, can be rapidly improved, and are safe and secure. This has given the teams building services time to get up to speed with working in a new way. But by April this year, we’ll expect services seeking to move from beta to live to meet all 26 criteria.

We’ve been really impressed with the great work going on across government to improve digital services, and we hope that the process has been as helpful for departments as it has been for us. By running assessments against the standard before it comes into full force in April 2014, we’ve been able to learn and improve the process. I want to explain a little about what we’ve learned so far and how we’ve changed the process in response.

Who is involved?

Last year, assessments were led by an experienced product owner from GDS with support from a technical architect. This worked well because the people in these roles have a good understanding of government services and the Service Standard - after all, they are already building or running digital services to meet the standard in their day jobs. They were sometimes joined by an analyst and/or a designer on a service-by-service basis. Since then, we’ve realised just how valuable a designer’s support is in making sure we cover all elements of each service in sufficient detail.

What we've learned

We’ve found that it’s really difficult to know beforehand how long an assessment is going to take. This isn’t surprising, since services across government can be very different; some are more complex than others. So we’ve extended the length of assessments from 2 hours to 3 hours - this works much better, because the services that don’t need all 3 hours finish early and the others don’t overrun.

We’ve learned that the assessment meetings run much more smoothly when the assessors understand a bit more about the service beforehand. That’s why we now ask for a description of the service before each assessment, including who its users are, what user needs the service is aiming to meet, and a working link to the service.

For the services themselves, we’ve found there is a real benefit in sharing learning from one service to the next - for example, by sharing code and design from a service that has passed an assessment, or by avoiding common pitfalls from services that haven’t. To help this learning spread across different services, we are going to start publishing assessment reports soon.

There may be a few more improvements that we’ll make ahead of April. If you have suggestions, please do comment below.

Follow Mark on Twitter: @Mark_Mc4 and sign up for email alerts here.


You may also be interested in:

Beta testing the Service Standard

Setting open standards for government documents

First open standards selected


1 comment

Comment by Tom Scott

Thanks for sharing this Mark, it's great to see processes as well as services are changing/iterating over time as we learn more.