Court finds police use of facial recognition unlawful
Appeal court rules in favour of privacy campaigners
Credit: Teguhjati Pras from Pixabay
The Court of Appeal has ruled that use of automated facial recognition by South Wales Police over the last few years has been unlawful.
The force first began using the technology at events and in public spaces in May 2017. The AFR Locate system deployed uses cameras and software to match captured images of passers-by against people featured on criminal “watchlists”.
One of those whose image was captured by the cameras was Cardiff man Ed Bridges who, supported by human rights campaign organisation Liberty, challenged South Wales Police in the High Court. The claim contended that the use of facial recognition compromised Bridges’ right to a private life and should therefore be considered unlawful.
The High Court ruled in favour of the police, concluding that the force’s deployments of AFR had been lawful and proportionate.
Bridges and Liberty appealed this ruling on five grounds.
The Court of Appeal has ruled in favour of the privacy campaigners on three of these – including the first, in which it was found that the original ruling had “erred in concluding that SWP’s interference with Mr Bridges’s rights… was ‘in accordance with the law’”.
Appeal judges also agreed with the appellants that the police had not provided evidence of an adequate data protection impact assessment in respect of the AFR deployment.
The force also failed to comply with its obligations under the Public Sector Equality Duty, the appeal court found.
“SWP erred by not taking reasonable steps to make enquiries about whether the AFR Locate software had bias on racial or sex grounds,” the judges said. “The court did note, however, that there was no clear evidence that AFR Locate software was in fact biased on the grounds of race and/or sex.”
Despite finding that the use of facial recognition had not been lawful, the court did not uphold Bridges’ claim that the technology had constituted a disproportionate interference with his right to a private life under article 8 of the European Convention on Human Rights.
It also ruled in favour of the police on the issue of whether it prepared an “appropriate policy document” in relation to the Data Protection Act 2018 – an issue on which the High Court had not concluded one way or the other.
“The court held that the DC (Divisional Court) was right to not reach a conclusion on this point because it did not need to be decided,” the appeal judges said. “The two specific deployments of AFR Locate which were the basis of Mr Bridges’s claim occurred before the DPA 2018 came into force.”
Following the verdict, Ed Bridges said he was “delighted that the court has agreed that facial recognition clearly threatens our rights”.
“This technology is an intrusive and discriminatory mass surveillance tool,” he added. “For three years now, South Wales Police has been using it against hundreds of thousands of us, without our consent and often without our knowledge. We should all be able to use our public spaces without being subjected to oppressive surveillance.”
'A major victory'
Liberty lawyer Megan Goulding welcomed the appeal court’s ruling, but said that “it is time for the government to recognise the serious dangers of this intrusive technology” and ban it entirely.
“This judgement is a major victory in the fight against discriminatory and oppressive facial recognition,” she said. “The court has agreed that this dystopian surveillance tool violates our rights and threatens our liberties. Facial recognition discriminates against people of colour, and it is absolutely right that the court found that South Wales Police had failed in their duty to investigate and avoid discrimination.”
South Wales Police does not intend to challenge the Court of Appeal judgement.
Chief constable Matt Jukes said that the courts’ examination of how facial-recognition technology is used by law enforcement is an “important step in its development”.
“I am confident this is a judgment that we can work with,” he said. “Our priority remains protecting the public, and that goes hand-in-hand with a commitment to ensuring they can see we are using new technology in ways that are responsible and fair.”
The chief constable said that the force will give “serious attention” to the findings of the court.
“[The] judgement helpfully points to a limited number of policy areas that require this attention,” he said. “Our policies have already evolved since the trials in 2017 and 2018 were considered by the courts, and we are now in discussions with the Home Office and Surveillance Camera Commissioner about the further adjustments we should make and any other interventions that are required.”
Jukes added: “We have already placed further focus on one concern. As our work with facial recognition has been developing, international concern about potential bias in algorithms has grown. We are pleased that the court has acknowledged that there was no evidence of bias or discrimination in our use of the technology. But questions of public confidence, fairness and transparency are vitally important, and the Court of Appeal is clear that further work is needed to ensure that there is no risk of us breaching our duties around equality. In 2019 we commissioned academic analysis of this question and, although the current pandemic has disrupted its progress, this work has restarted and will inform our response to the Court of Appeal’s conclusions.”