‘Innovation is problematic when it is behind the scenes’
Jeni Tennison, CEO of the Open Data Institute, talks to PublicTechnology about the organisation’s work with government and how to balance risk and reward in the use of open data
“We must make data as open as possible while protecting people’s privacy, commercial confidentiality and national security.”
These words form part of the vision of the Open Data Institute, as defined on the organisation’s website. Founded in 2012 by Sir Tim Berners-Lee – the inventor of the World Wide Web – the ODI is a non-profit entity that works with government and industry to promote change via the use of three “levers”: open data programmes targeted at specific sectors; advocacy and the provision of data tools; and convening networks of peers to share knowledge.
At the ODI’s annual summit in London late last year, PublicTechnology caught up with the ODI’s chief executive Jeni Tennison – who formerly served as lead developer on legislation.gov.uk – to find out more about the organisation’s role in the public sector, how government can support the use of open data, and whether innovation and ethics are always at odds.
PublicTechnology: How does the ODI work with the public sector?
Jeni Tennison: We are really interested in how to help people, communities, and organisations make better decisions with data. We have two main ways in which we interact with the public sector. One is helping public sector organisations themselves – particularly on how they use data internally for evidence-based decision making, how they think about data in the ecosystems they work in, and how they influence the way those ecosystems work. An example is a project we did last year looking at open data in the public sector, in which we provided stimulus funds to four local authorities, helping them to explore how to use open data – including using health data in Kent, and using data in Doncaster to help children choose career paths and education. So, we really work at the practical level with public sector organisations. The other level at which we work with government is around regulation – what kind of policies, stimulation, and regulation you should have around data; the shape of our National Data Strategy, for example, or what the Geospatial Commission should do. So, at that policy level, we work with central government and regulators on how to create a data ecosystem that works for people.
Broadly speaking, how is central government doing in terms of recognising – or even realising – the value of the data it holds?
Government has gone through a number of changes. Data policy when we first started was led by the Cabinet Office, and was very much focused around transparency and accountability – on what government data should be made available to the rest of society. I think there are two other streams that have come in over time. One is a focus within government on how they use data: how they do so ethically, and how they can increase capability. Then the one that we have been really pushing for, and trying to get higher in people’s minds, is the role of data in our society more generally, and government’s role in shaping the way the data ecosystem works, through policymaking and regulation. DCMS are now working on the national data strategy. That is within a context where the importance of data is increasingly recognised – in order to unlock the benefits of AI and all of the other emerging technologies, drones and driverless cars. It is no longer just about government putting data out, but about what the role of data is in our society. That is a real shift.
"It is very easy to bring technologists in, and technologists will get very excited about being able to bring lots of data together. I actually think we need to start by looking at the decision makers that we want to be using more data and evidence."
Do you think government can and should play a more active role in shaping and regulating the data ecosystem?
Yes, but I think it is not just about regulation – there is a whole bunch of things government can do to help nudge the data ecosystem in the right direction. So, for example, the work that we are currently doing with Ofgem is around creating open standards in the energy sector, so that there is more interoperability around data, so that people can build applications more easily. That is not regulation, that is not hard and fast rules – that is just playing a stimulation role. Other pieces of work we have been involved in where government has committed money to drive innovation in the sector, such as money to fund start-ups or competitions and prizes – those are other ways of nudging the ecosystem in the right kind of directions without implementing hard regulations.
What do you see as the biggest challenges to the government making the best use of its data – are they technical, procedural, or cultural?
I think there are technical challenges and cultural and capability challenges. Technical challenges include the lack of data standards – or the lack of adoption of data standards, in some places. Data quality tends to be an issue – not just a government issue, but an issue in every organisation that we encounter. They say ‘our data is of terrible quality’, and the reason for that is that it is not being used sufficiently, and the mechanisms aren’t there to improve it. That is the technical side. I think from a cultural side, part of it comes down to capability – but in its broadest sense. There is obviously capability inside the public sector to use and to analyse data, and there is data science capability. But there is also the broader issue of having people who understand the role of data – and digital more broadly – in the public sector, and in society. I think that is about training policymakers and other decision makers so that they understand where data is useful, how it is useful, and how you can use it as a tool. They do not need to understand how to make visualisations, or analyse data – they need to understand it at this other level.
Is that a symptom of government siloes, and the need for data scientists to work with policy and operational people?
The standard digital cross-functional team plays a part here – but it is also about those higher decision-making levels, the perm-sec level and the director level, which set the context of ‘we should be thinking of using data and digital here, and this is what that looks like’.
For a smaller public sector body – perhaps a council or NHS trust – that wants to make better use of its data, where do they start?
If you take a purely data-focused lens to that, you are probably starting in the wrong place. It is very easy to bring technologists in, and technologists will get very excited about being able to bring lots of data together. I actually think we need to start from a place where you look at the decision makers that we want to be using more data and evidence. Where do they get their information from? What data is that information based on? And where does data get collected, used and shared? I think the first step is really identifying the actual problems that are relevant to the goals of the organisation, and the decision makers who need to be influenced – and then working backwards to ‘what is the data that we need to help them?’. We have a tool called the Data Ecosystem Map, which is a visual language for laying that all out – for [demonstrating who] the decision makers are, where their information comes from, and which organisations are supplying and collecting the data. From that you can identify where the blockers are, and where things aren’t working as smoothly as they could. I think taking that purpose-led or goal-oriented approach to getting data within the organisation is the way to do it.
It often seems like a battle between, on the one hand, innovation and, on the other, accountability and ethical use of data. Do you think those two things are inherently opposed?
I quite like the analogy that having good brakes allows you to go faster in a car. But there are also lots of other things around you when you’re driving that help keep you safe – like speed limits! I think we have to ask ourselves what kind of innovation we want, the degree to which we are happy to take risks and, where we are happy to do so, how we put in place the monitoring framework that means we can detect when things are going wrong. For example, if we are dealing with decision-making systems that are helping to diagnose people in hospitals, we can see what the benefits might be – speedier diagnosis, and having a better handle on what kind of treatments are actually going to work. But, to make those systems acceptable, we have to put trustworthy data governance around them [for] the people who are affected. If you do that, then you get permission. I think where innovation is problematic is when there is innovation behind the scenes that suddenly pops up and people don’t know about it. Anything that gets hidden, people worry about. Innovating in the open enables innovation to happen rapidly. None of us can predict what happens with technology; none of us can predict the outcome of deploying a particular algorithm in a particular use. So we need checking and monitoring mechanisms – in that example of a diagnosis, actually seeing what the results of the algorithm are, and how they map onto particular groups of people. Are they getting different treatment because of characteristics that we don’t want to be used in an algorithm? How do we monitor that, and how do we make sure that it is working? And, when we monitor it, then we can adjust rapidly.
Is that something that should be led by regulation, or just self-policing work?
I don’t think there is a one-size-fits-all answer. In some cases, such as the NHS, where the use of data affects life-and-death situations, I would expect that to be [a case of regulatory] monitoring. In something where the implications of going wrong are much less stark, then some self-regulation is going to be fine. We also need to look at the role of third-sector organisations and consumer rights groups as transparency and accountability mechanisms, so that the ecosystem works together.