Streamlined GDS service standard removes need for ministerial tests
Digital agency publishes new 14-point checklist emphasising inclusion and cross-departmental services
The Government Digital Service has published an updated and streamlined set of guidelines for government services, with a sharpened focus on inclusion – but no requirement to subject services to a ministerial test.
The incoming Service Standard will take effect from 1 July and contains 14 conditions government services must meet in order to pass GDS assessment.
It replaces the Digital Service Standard, an 18-point checklist that has been in place for four years – having itself replaced and refined the first iteration of the standard, which came into effect in 2014 and consisted of 26 points.
Five of the points on the latest version are more-or-less identical to ones featured on the outgoing standard: understand users and their needs; have a multidisciplinary team; use agile ways of working; iterate and improve frequently; and make new source code open.
In other cases, the new standard uses stronger or more definitive language.
The incumbent standard requires developers to “understand security and privacy issues”, while the new rules ask that they “create a secure service which protects users’ privacy”.
The new service standard stipulates the need to “make sure that everyone can use the service”. Previously, the requirement was that developers sought to “encourage everyone to use the digital services”.
The existing requirement to “use open standards and common platforms” has, in effect, been replaced by “use and contribute to common standards, components and patterns”.
Point 12 of the outgoing rules – “make sure users succeed first time” – has similar intent to the fourth point of the new checklist: “make the service simple to use”. Meanwhile, “evaluate tools and systems” is changed to “choose the right tools and technology”.
In other cases, multiple rules have been condensed into one point.
The 2015 iteration dedicated three points to the need to collect, analyse, and report back on performance data.
The incoming version implements “define what success looks like and publish performance data” as a single rule.
The new rules also require developers to “solve a whole problem for users”, “provide a joined-up experience across all channels”, and “operate a reliable service”, which seem to collectively replace several similar points of the outgoing guidelines.
One rule that has been excised is the stipulation that services must be tested by the relevant minister; the new standard no longer makes ministerial approval mandatory. However, the requirement to use agile methods does state that service teams ought to "make sure that the right people know what’s happening" and should, if appropriate, test the service with a minister or other "senior stakeholder".
From 1 July, all services put forward to be assessed will be judged against the new standard.
In a blog post announcing the updated rules, GDS’s content lead for service design and standards Stephen Gill said that “at some point we think it makes sense for services that are already in progress to transition to the updated Service Standard” – but this is unlikely to take place until sometime next year at the earliest, he added.
Gill said that a major consideration in the creation of the new standard was coming up with guidelines that would be useful for developers of citizen services across the wider public sector, as well as for those working on internal services.
“The standard might have started life as a tool for central government teams working on public-facing transactions, but now you can use it if you’re, say, working in a local authority – and we’ve made it easier to use with internal or non-transactional services too”, he said.
The GDS content lead said that another key aspect of the updated rules was that they are designed to be applied to “services that cut across departmental boundaries”. The Service Standard could also “help to solve an underlying policy problem”, he added.
But the new framework “isn’t a grand departure”, according to Gill.
“The vast majority of the underlying intent is the same,” he said. “We still care about things like building services iteratively, delivering value to users as quickly as possible, open-sourcing your code and using common platforms. And the trigger for assessments hasn’t changed: you’ll still need to come and talk about the digital service you’ve built, but you’ll be asked some extra questions about the wider landscape around it.”
GDS first announced that it was planning to revamp – and rebrand – the standard back in September 2017. By September 2018, the new standard was nearing a final draft and a blog from Gill revealed that the updated rules would be founded on the goal of designing end-to-end services and offering citizens “a joined-up experience across different channels”.