Why data is key to the success of city devolution
A lot has been promised to the public on city devolution and the price of failure is high. Eddie Copeland, director of government innovation at Nesta, argues that joining up data across those regions is one basic way to give the plans a fighting chance of success.
Greater Manchester's GM-Connect programme is showing promise on data sharing - Photo credit: PA Images
Imagine it’s the day after the mayoral election in a UK city region.
The victorious candidate gathers together the local authorities, the Local Enterprise Partnerships, the public sector bodies and citizen groups for the first time.
“Right everyone,” says the mayor. “City devolution gives us an opportunity to make progress on three things across the region. We’re here to reform local public services, boost economic growth, and increase local democratic engagement. How are we going to do it?”
If we stop and pause on that scene for a moment, we’ll notice that these responsibilities speak to the two great political challenges of our time.
The first is the financial crisis facing the UK public sector; local authorities in England alone face a £12.4bn shortfall by 2020. The second is the rising level of public disillusionment and disengagement with traditional political parties and systems.
How well placed is city devolution to address those challenges?
Returning to the room, the mayor asks: “What’s the scale of demand for services across the region and how are we going to intelligently redesign them?”
Someone responds: “Sorry, mayor. We don’t have the data on that; at least not for the whole region.”
“Well, which interventions to support local businesses are working and should be scaled?”
Again, the answer is likely to be: “Sorry, mayor, we don’t have all the data on that.”
With a growing sense of exasperation the mayor pleads: “Well what do the citizens across this region think our priorities should be?”
“Sorry mayor, we don’t have the data to answer that.”
The jigsaw problem
I present this imaginary scene to make the following point: without the ability to join up, analyse and act upon data at a city region scale, I have no idea how city devolution is meant to succeed.
It all comes down to the 'jigsaw problem'.
Simply put, every local authority and public sector organisation has their little piece of the data, but no one can put it all together, take a step back and see the bigger picture. That’s a profound problem if you’re trying to reform public services.
For example, how can cities scale the use of shared services if organisations can’t see how issues, opportunities and demand transcend their boundaries?
How can they target scarce resources at areas of greatest need if they don’t have the data to point to the hotspots on any given issue?
Or coordinate the actions of different teams if they don’t have data on what each is doing?
How can they predict and prevent problems from happening before they become serious and expensive to resolve if they don’t have access to the datasets that correlate with higher risk?
And how can they encourage individuals and organisations to build useful products and services using open data if the data is too fragmented to offer developers a viable business model?
The causes of the jigsaw problem are well known.
There are technical barriers. Legacy IT systems can make it hard to get the data out. Poor practice by some IT suppliers leaves councils paying to have an API built just to extract their own data. Some outsourcing contracts are written such that local authorities do not have access at all.
There are data barriers. Data can be recorded according to different formats and standards, creating the digital equivalent of comparing apples and oranges. Records about one person or place often lack a common identifier, making it hard to connect information across different systems.
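To make the identifier problem concrete, here is a minimal illustrative sketch (the records, field names and "GM-" identifier scheme are all hypothetical, not drawn from any real system). Two agencies hold data about the same person, but with inconsistent formats and no shared key the records cannot be joined reliably; with an agreed identifier and common standards, linking becomes trivial.

```python
# Hypothetical records from two agencies. Different field names, different
# date formats, no shared identifier -- the same person cannot be matched
# with confidence.
housing = [
    {"name": "J. Smith", "dob": "1985-03-12", "address": "4 Oak Rd"},
]
social_care = [
    {"name": "John Smith", "date_of_birth": "12/03/1985", "addr": "4 Oak Road"},
]

# The same records once both agencies adopt a common (illustrative)
# identifier and shared field standards, e.g. ISO 8601 dates.
housing_std = [
    {"person_id": "GM-0001", "dob": "1985-03-12", "address": "4 Oak Rd"},
]
social_care_std = [
    {"person_id": "GM-0001", "dob": "1985-03-12", "addr": "4 Oak Road"},
]

def link(a, b, key="person_id"):
    """Join two record sets on a shared identifier field."""
    index = {r[key]: r for r in b}
    return [(r, index[r[key]]) for r in a if r.get(key) in index]

matches = link(housing_std, social_care_std)
print(len(matches))  # one linked pair
```

The point is not the code itself but the precondition it exposes: until organisations agree on identifiers and formats, no amount of analytics capability can put the jigsaw together.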
There are cultural and organisational barriers. Every public sector organisation was originally established to serve a certain set of people with a certain set of functions. It takes a considerable cultural and structural leap – not to mention the wholehearted backing of leadership teams – to start systematically collaborating with data.
And there are legal barriers, both real and perceived. To the typical service manager, data legislation can be both intimidating and impenetrable. Some things are simply (and rightly) not allowed. But many other data initiatives could be achieved and are merely assumed to be prohibited, so people understandably err on the side of caution. Without a change to this culture of risk aversion, little will happen.
What’s the solution to overcoming these barriers?
Back to basics
Greater Manchester shows one promising path.
Inspired by the work of the Mayor’s Office of Data Analytics in New York City, the GM-Connect programme is working to put in place the technology, data and legal mechanisms to complete the region’s own jigsaw.
With a long list of projects in the pipeline, an initial pilot has focused on creating a “child passport”: federating intelligence so that all agencies have a single view of what is known about vulnerable children. It’s in all our interests that they succeed.
Meanwhile at Nesta, we’re conducting pilots in London, the North East and Sheffield to establish the core principles for how an Office of Data Analytics model can be adapted for each region’s needs.
Our hope is that these initiatives can help correct one of the most misleading messages to come out of the smart city movement: that the route to harnessing city data starts with putting in place new digital infrastructure – Internet of Things networks and the like.
All too often that just means cities procuring new technology that gives them even more data that they have absolutely no idea how to use.
Don't get me wrong: technology has a huge amount to offer, but cities need to walk before they run. That takes three steps.
First, they need to put in place the capability to bring together, analyse and act upon the vast quantity of data that already sits within the public sector. They must develop the necessary skills. They must create the legal structures to share and process data.
They must ensure the requisite leadership and culture are present to embed the use of data into every aspect of how they make decisions and deliver services. This is the purpose of the Office of Data Analytics model.
It’s not as glamorous or quick as implementing a new tech platform. But if city regions can’t even succeed at this stage, they are highly unlikely to get much value from the additional data that new technology would provide.
Second, they should seek to benefit from existing datasets that sit outside of government. Instead of merely publishing their own data on open data portals, they should also publish the problems they are trying to solve, and invite businesses, charities, universities and citizens to provide data that helps address them.
Finally, if no existing datasets can solve an issue adequately, then cities should indeed look to technology to fill the gaps. Having exhausted the potential of the data they already have, there will be a much clearer business case for where it makes sense to spend public money on new tech.
Keeping up with expectations
A lot has been promised on city devolution. Done well, it really could improve public services, boost economic growth and strengthen democratic engagement. But if cities fail to meet those expectations, it risks leading to yet more public disillusionment. That, we can ill afford.
There are many routes to solving this problem.
But given the pivotal role that data will play, there are many worse places to start than by putting in place the basic data mechanisms to give the new generation of elected mayors – and all those who will work with them – a fighting chance of success.