How the police want to use AI and analytics to ‘adopt a public-health approach to crime’
Critics have claimed that an analytics project being led by West Midlands Police is bringing the dystopia of Minority Report to life. But superintendent Iain Donnelly tells PublicTechnology that he is determined to use tech and data ‘as a force for good’
Credit: Ben Birchall/PA Archive/PA Images
A significant step on the road towards the UK becoming “an inevitable dystopian hellscape” is how one article characterised the National Data Analytics Solution (NDAS) project being run by West Midlands Police.
Another headline claimed that it marks a “descent into state surveillance”.
One publication asked whether “Minority Report is now a reality”, while another site described the scheme as “Minority Report-lite”, and yet another called NDAS “the real Minority Report”.
Style magazine Dazed at least mixed things up a bit in terms of its cinematic references, suggesting that the project had “Robocop vibes”.
Even the website of celebrity conspiracy theorist David Icke was moved to post about the programme, labelling UK law enforcement as “the thought police”.
Iain Donnelly, superintendent of West Midlands Police and the project manager for NDAS, has seen all these headlines – and even shared many of them online.
“They say any publicity is good publicity, but I'm not sure that being on David Icke's website is where I would have wanted to be,” he notes on LinkedIn.
NDAS, the project in question, is a proof-of-concept exercise to define and design a predictive-analytics offering that could potentially be rolled out across UK law enforcement.
"This is not about creating predictive insights which allow the police to start arresting people or even, for that matter, knocking on doors. This is about adopting a public-health approach to crime – and recognising that the police cannot arrest or enforce their way out of this situation."
When asked by PublicTechnology to characterise the scope and aims of the programme, Donnelly opts to first “deal with what it is not about”.
“Any use of predictive analytics in the UK will never be about looking at the general population, or looking at people going about their daily business,” he says. “This will always be about helping us make sense of increasing volumes of data – that we are already lawfully in possession of – in a way that helps us make better decisions.”
Donnelly adds: “At the moment, we do things with a combination of our professional judgement and gut instinct. Everybody has a different view, and what works can be quite subjective. Whereas we want to let the data tell the story, and make a more objective assessment.”
NDAS has its roots in a project called Data Driven Insights, which was embarked upon three years ago with the backing of £16m in funding from the West Midlands Police and Crime Commissioner.
The scheme – which has seen the force bring together intelligence data sets with information from command and control and other operational sources – was launched with the aim of allowing officers to “access multiple systems from a single screen [and find out] everything there is to know about a person or location”, Donnelly says.
“The intention of DDI was to bring together all of the data that sits within West Midlands Police to allow officers to [access] all the data they need to make decisions on the front line,” he adds.
The programme has even seen the force establish its own in-house data-science lab, and employ “a team of full-time data scientists”.
On the back of the initial successes of this data-driven approach, West Midlands Police submitted a business case to the Home Office in 2016 for what would become NDAS. The department awarded the force £4.5m in funding on 1 August 2018, and work is now underway – in partnership with key technology and services supplier Accenture.
The programme was conceived as a means to combat “the issues that pose the greatest threat” to police forces, Donnelly says.
Such issues can, broadly, be summed up as a need to do more with less. In fact, a lot more with a lot less.
“Police resources are under greater stress than they have ever been before,” Donnelly says. “There has been a significant decrease in resources coupled with a significant increase in demand.”
The diminishment of resources comes as a result of a huge reduction in central government funding in recent years, Donnelly says. The growing demand, meanwhile, can be attributed to the fact that “the offending landscape has massively changed”.
“There are still all the crime types that we have historically had to investigate – such as robbery, burglary, and assault – but there is now also a lot of internet-facilitated offending,” he adds. “If we are not seen as competent with digital crime, then we are seen as irrelevant.”
In addition to the rise of digital crime methods, Donnelly says, there has been an increase in officers being called upon to act “as the organisation of last resort” and intervene in circumstances that would be better served by mental-health or social-care organisations that, like the police, have seen their budgets put under ever-greater pressure but, unlike the police, cannot offer round-the-clock services.
Moreover, there has also been a rise in awareness of crimes such as domestic abuse, modern-day slavery, and sexual exploitation – which, in turn, has seen an increase in the frequency with which they are reported, according to Donnelly.
“There is greater willingness of people to discuss them and come forward – which is brilliant, and we want to encourage that,” he says.
‘A public health approach to violence’
Essentially, the thesis being tested by the NDAS programme is that better analysis and interpretation of data could allow the police to manage and direct their scarce resources as effectively as possible. On top of which, spotting the patterns inherent in serious crimes – and in the people who commit and fall victim to them – could help prevent them in the future.
The NDAS programme will examine the potential impact of predictive data analytics in three use cases: serious violent crime; modern-day slavery; and police workforce management.
The data sets covered by the programme include details of all relevant interactions with the criminal justice system – including information on crime recording, crime reporting, convictions, and custody data – as well as command and control, crime intelligence, and some internal police HR data.
The serious violence use case – which was mandated by the Home Office as part of the funding award – will see West Midlands Police work with data scientists from central government.
“We are looking to identify those individuals who are deemed to be at the highest risk of escalating their behaviour to become perpetrators of gun and knife crime. We are doing the ‘who’; the Home Office is doing the ‘where and when’,” Donnelly says. “In terms of the ‘how’, what we are going to do is look at a cohort of individuals who have already got convictions for gun and knife crime – many thousands of people – then we will let the data tell us what are the key predictive indicators on the journey they have been on to take them to the point where they stab someone or shoot someone.”
In numbers
Nine – forces collaborating on NDAS
£4.5m – Home Office funding for the project
Serious violence, modern-day slavery, and workforce wellbeing – the three use cases on which predictive analytics is being tested
Rise in UK knife crime during the 12 months to the end of June 2018, according to ONS statistics
Approximate number of emergency calls made each day to West Midlands Police
The intention is that identifying such “indicators” that an individual is on a path towards gun or knife crime will allow people who could be on a similar journey to be given a “risk score”. The police – or, more likely, other agencies, such as social services – could theoretically then intervene to offer appropriate support.
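The workflow Donnelly describes – learning which indicators are associated with a known outcome in a historical cohort, then combining them into a score for others – resembles a standard supervised approach. The sketch below is purely illustrative: the cohort data, indicator names, and scoring method are all invented for the example, and this is not the NDAS implementation.

```python
# Illustrative sketch only (invented data and indicator names): measure how
# strongly each indicator is associated with a known outcome in a historical
# cohort, then combine those associations into a risk score for a new record.
from collections import Counter

# Hypothetical cohort: indicators observed for each person, and whether they
# went on to commit a serious violent offence.
cohort = [
    {"indicators": {"prior_assault", "school_exclusion"}, "outcome": True},
    {"indicators": {"prior_assault", "gang_association"}, "outcome": True},
    {"indicators": {"school_exclusion"}, "outcome": False},
    {"indicators": {"gang_association", "prior_assault"}, "outcome": True},
    {"indicators": set(), "outcome": False},
    {"indicators": {"school_exclusion", "gang_association"}, "outcome": False},
]

def indicator_rates(cohort):
    """P(outcome | indicator present) for each indicator seen in the cohort."""
    seen, positive = Counter(), Counter()
    for record in cohort:
        for ind in record["indicators"]:
            seen[ind] += 1
            if record["outcome"]:
                positive[ind] += 1
    return {ind: positive[ind] / seen[ind] for ind in seen}

def risk_score(indicators, rates):
    """Mean outcome rate of the indicators present (0 if none are known)."""
    known = [rates[i] for i in indicators if i in rates]
    return sum(known) / len(known) if known else 0.0

rates = indicator_rates(cohort)
score = risk_score({"prior_assault", "school_exclusion"}, rates)
print(f"risk score: {score:.2f}")
```

In practice a model of this kind would be trained on thousands of records and many more variables, as Donnelly describes, but the principle is the same: the score is derived from observed historical associations rather than an officer's individual judgement.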
The concept of these interventions is, without doubt, the NDAS programme’s most controversial aspect, and is the basis for all the Minority Report references.
Donnelly’s first response to the fears inherent in such comparisons is to stress that the programme is still only in the proof-of-concept stage. The results of the project will be judged objectively, he says, and NDAS is not “doomed to succeed”; if the scheme does not offer sufficiently valuable insights, it will not be pursued beyond these exploratory stages.
The West Midlands superintendent also says that an escalation in the volume or severity of crime cannot be solved simply by an equivalent escalation in punishment. New approaches are needed, he believes.
“The other really important point to make here is that this is not about creating predictive insights which allow the police to start arresting people or even, for that matter, knocking on doors,” Donnelly says. “This is about adopting a public-health approach to crime – and recognising that the police cannot arrest or enforce their way out of this situation.”
He adds: “Serious violence is increasingly being talked about as a public health issue – it is not something that is inevitable. It could be treated like an illness, and there are multiple factors in terms of the forces that drive serious violence. Some of it will be around policing, some of it will be mental health and [other services].”
The data being taken in by NDAS is limited to information from nine law-enforcement agencies, including the West Midlands Police, as well as forces from Merseyside, Greater Manchester, West Yorkshire, and London’s Metropolitan Police Service. The four other forces involved are not being publicly named.
If the programme proves successful, Donnelly believes that, in the future, data from a range of sources – including health, education, and social services – could be assessed in the round to deliver greater insights.
He says: “There are multiple factors that conspire to create an environment where violent crime is more likely to happen. You could put them in a big pot and call that ‘deprivation’, and you could envisage a scenario where all those factors bring their data together – not to spy on people or be a big brother, but to be a benevolent force for good. To help people escape from drug dependency. To get them into education. To get them out of a life that is depressingly predictable.”
Donnelly adds: “The reality is that violent crime is a very complex issue. But it has got to be in everyone’s interest to try and stop young men from killing each other. We are talking about young men with their whole lives ahead of them, and it makes for really tragic reading when you see the number of young men affected. If we can stop even a handful, [then it will be worth it].”
The modern-day slavery use case will see NDAS process data from a large number of previous cases to try to determine indicators that someone is being trafficked or forced into a life of bonded labour, organised crime, or sex work – or is at risk of being so. According to Donnelly, the overarching question being posed is “what does modern-day slavery look and feel like?”.
"Any use of predictive analytics in the UK will never be about looking at the general population, or looking at people going about their daily business. This will always be about helping us make sense of increasing volumes of data – that we are already lawfully in possession of – in a way that helps us make better decisions."
“It is such a hidden crime – people do not come and tell us about it, because of fear, or language barriers or culturally if, in their own countries… people have a distrust of the police,” he says. “Rather than waiting for people to come and tell us about stuff – which they are not going to do in most cases – we will let the data tell the story.”
The final use case concerns the “workforce wellbeing” of police officers and staff. The goal of NDAS’s work in this case will be to identify “the factors that are predictive of an individual going off long-term sick”.
The theory is that, by being able to identify the factors that are causing stress, anxiety, or depression – which could include workload, or exposure to traumatic situations – the police will be able to offer their employees increased support, or even a change of role, before the mental-health impact of their current job leaves them unable to work.
A biased view?
Perhaps the gravest of the common concerns people have about the use of analytics and AI in a law-enforcement environment is that, rather than eliminating the possibility of human prejudice and preconception, it could simply reinforce incumbent biases.
Donnelly says that the risk of reinforcing bias through the use of predictive analytics is one he and his team “are very mindful of”, particularly in the use case around serious violence. He claims that steps are being taken to ensure that the data – and the conclusions it points towards – are as free of bias as possible.
“We are aware that, when we do something around predictive analytics in this space, there is a strong possibility that there will be an over-representation of individuals from BME communities being identified,” he says. “To be as transparent as we possibly can, at the technical level of data, we are not including ethnicity in the data. We are excluding anything to do with ethnicity in the analytics. We are even considering stripping out anything to do with geography, to try and stop the possibility that we are reinforcing a geographical bias.”
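As a purely illustrative sketch of the kind of exclusion Donnelly describes – stripping ethnicity, and potentially geography, from records before they reach any analytics – the snippet below uses invented field names and is not the NDAS pipeline. It is worth noting that removing a field does not by itself remove bias, since other fields can act as proxies for it.

```python
# Illustrative only: drop protected attributes and potential proxy fields
# (e.g. geography) from records before any modelling. Field names are
# invented. Caveat: remaining fields may still correlate with the excluded
# ones, so exclusion alone does not guarantee an unbiased model.
EXCLUDED_FIELDS = {"ethnicity", "nationality", "postcode", "ward"}

def strip_excluded(record):
    """Return a copy of the record without the excluded fields."""
    return {k: v for k, v in record.items() if k not in EXCLUDED_FIELDS}

raw = {
    "person_id": "A123",
    "prior_offences": 4,
    "age_at_first_contact": 16,
    "ethnicity": "X",
    "postcode": "B1 1AA",
}
clean = strip_excluded(raw)
print(sorted(clean))  # excluded fields are gone
```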
To ensure transparency and a keen awareness of ethical issues, the programme is also engaging with a range of external parties including defence and security think tank the Royal United Services Institute, and professor Martin Innes – director of Cardiff University’s Crime and Security Research Institute. NDAS is also consulting “various independent ethics groups, to be as fair and data-driven as possible”, Donnelly says.
The ongoing first stage of the NDAS project includes the construction of a technical platform, the collection and collation of the relevant data sets, and their subsequent analysis by data science professionals.
Another issue facing NDAS – one that may seem more prosaic, but that Donnelly picks out as “the biggest challenge of this project” – is something that will be all-too-familiar to anyone with even a cursory knowledge of the public sector technology and data landscape.
“There is a huge reluctance from police force to police force to share data,” he says. “In an ideal world, we would want to be bringing the data into our AWS cloud platform. But there is still huge nervousness about using cloud.”
The second, and concluding, stage will be to put together a business case that proposes to the Home Office how a nationwide predictive analytics offering – for use by all 43 forces across England and Wales – would work.
Even if NDAS can successfully answer any ethical, legal, and practical concerns people might have with the programme, PublicTechnology puts it to Donnelly that you do not necessarily need sophisticated analytics to predict that deprivation is linked to crime, that violence escalates from fists, to knives, and onto guns, and that a stressful, traumatic job causes stress and trauma.
“That is a very good point,” he says. “This is a proof of concept; it is about a chance for us to ask the question: is what this produces better than the combination of gut feel and professional intelligence that we have already? We have a number of very knowledgeable people, we have all sorts of really good stuff going on already, and I would suggest – in the vast number of cases – we get it right. But that is a very subjective process.”
He adds: “We have never looked at the data and [brought together] multiple data sets and asked: will this enable us to do something we haven’t done before? The answer might be ‘no’, and if it doesn’t give us much more than what we did already, we might just give that [data] to a seasoned offender manager. But, until we do this, we are not going to know.”
As we conclude our conversation, Donnelly once again confronts people’s fears and criticisms of the project, and reasserts that NDAS is founded on nothing more sinister than a desire to prevent the irrevocable damage wrought on the lives of those who commit or suffer crime.
“We want to be as transparent as we possibly can be,” he says. “We know that there is a lot of distrust of this technology, and the reason why there is so much distrust is that there are governments and regions that are using AI for social control. Some of the things that Chinese [authorities] are doing – building visual databases of every one of their citizens – are examples that are genuinely concerning.”
He adds: “The desire of law enforcement in the UK is to use this as a force for good. I seriously hope that when people do see what we are doing they will see that this is, hopefully, giving us something that will stop bad people from doing bad things.”