Government review calls for mandatory transparency in public sector use of algorithms

Laws must also be updated, according to the Centre for Data Ethics and Innovation


A government-led review has recommended the implementation of a “mandatory transparency obligation” for all public-sector entities using algorithms to make decisions that impact citizens.

The Centre for Data Ethics and Innovation, which was set up by the government in 2018 to advise on ethical issues related to data use and artificial intelligence, this week published the findings of an 18-month review into bias in algorithmic decision-making.

The centre picked out three key recommendations, the first of which is that public-sector use of algorithms in significant decisions affecting individuals should be subject to transparency requirements.

“Government should place a mandatory transparency obligation on all public sector organisations using algorithms that have an impact on significant decisions affecting individuals,” the CDEI said.

The government should also update anti-discrimination legislation to take account of how it might apply to the use of algorithms.


“Government should issue guidance that clarifies the application of the Equality Act to algorithmic decision-making,” the review said. “This should include guidance on the collection of data to measure bias, as well as the lawfulness of bias-mitigation techniques – some of which risk introducing positive discrimination, which is illegal under the Equality Act.”

The third recommendation made by the centre applies to entities across all industries.

“Organisations should be actively using data to identify and mitigate bias,” the CDEI said. “They should make sure that they understand the capabilities and limitations of algorithmic tools, and carefully consider how they will ensure fair treatment of individuals.”

The CDEI review focused on the use of algorithms in four sectors: financial services; local government; policing; and recruitment.

Research conducted in the course of the review found that six in 10 citizens are aware that algorithms are used by organisations in decision-making – but only three in 10 said they were aware of their use in local government.

There is widespread support for data – including information on ethnicity and sex – being used to tackle issues of bias, the research found.

According to the CDEI, “the review points to the need for an ecosystem of industry standards and professional services to help organisations address algorithmic bias in the UK and beyond.” 

“To catalyse this, the CDEI has initiated a programme of work on AI assurance, in which it will identify what is needed to develop a strong AI accountability ecosystem in the UK,” the centre added. “Other related CDEI work includes: working with the Government Digital Service to pilot an approach to algorithmic transparency; supporting a police force and a local authority to apply lessons learnt and develop practical governance structures; and active public engagement to build understanding of the values that citizens want reflected in new models of data governance.”

For its part, the government needs to provide “leadership and coordination”, according to the report, which “urges the government to be clear on where responsibilities sit for tracking progress”.

Adrian Weller, a board member of the Centre for Data Ethics and Innovation, said: “It is vital that we work hard now to get this right as adoption of algorithmic decision-making increases. Government, regulators and industry need to work together with interdisciplinary experts, stakeholders and the public to ensure that algorithms are used to promote fairness, not undermine it. The Centre for Data Ethics and Innovation has today set out a range of measures to help the UK to achieve this, with a focus on enhancing transparency and accountability in decision-making processes that have a significant impact on individuals. Not only does the report propose a roadmap to tackle the risks, but it highlights the opportunity that good use of data presents to address historical unfairness and avoid new biases in key areas of life.”

 

Sam Trendall
