'It's not a choice between privacy or innovation', ICO tells NHS trusts

Written by Sam Trendall on 4 July 2017 in News

After Royal Free is found to have breached Data Protection Act, information commissioner Elizabeth Denham offers four-point checklist for other trusts in their use of technology and data

Elizabeth Denham: "No-one suggests that red tape should get in the way of progress" Credit: DCMS

Despite ruling that the Royal Free NHS Foundation Trust breached the Data Protection Act in working with Google DeepMind, information commissioner Elizabeth Denham has told the NHS “it’s not a choice between privacy or innovation”.

The ICO this week published the findings of an investigation that concluded that the Royal Free failed to comply with its data-protection responsibilities in sharing patients' details with the DeepMind artificial intelligence platform during a trial of a new mobile application.

Royal Free, which employs 10,000 people across three hospitals and more than 30 other sites in north London and Hertfordshire, began working with DeepMind in September 2015. Details of about 1.6 million patients were shared with the AI specialist during a trial programme for a mobile app it had developed, called Streams, that is designed to help diagnose acute kidney injuries and alert clinicians accordingly.

Having completed its investigation, the ICO has concluded that there were “several shortcomings in how the data was handled”. Such failures include neglecting to adequately inform people that their information was being used in the trial, not providing sufficient opportunity to opt out, and submitting to DeepMind an amount of data that was excessive in relation to the needs of the trial.  

Alongside its conclusions, the regulatory body’s chief has also published a list of “four lessons” other trusts can learn from the case when using data and technology to improve clinical outcomes.

The first of these is that, while the positive results generated by Royal Free’s work with DeepMind are welcome, the failures in the trust’s data-handling processes could have been averted.

“The trust has reported successful outcomes. Some may reflect that data-protection rights are a small price to pay for this,” said Denham. “But what stood out to me on looking through the results of the investigation is that the shortcomings we found were avoidable. The price of innovation didn’t need to be the erosion of legally ensured fundamental privacy rights.”




The second lesson was that trusts should not “dive in too quickly”. Before beginning any trial, a thorough “privacy impact assessment” should be completed, said the information commissioner. 

Thirdly, Denham advised the NHS that, just because the cloud offers the possibility of processing a much greater volume of data, this does not mean doing so is necessary or prudent. 

“Consider whether the benefits are likely to be outweighed by the data-protection implications for your patients,” she said. “Apply the proportionality principle as a guiding factor in deciding whether you should move forward.”

The ICO leader’s final piece of advice was simple: “Know the law, and follow it”.

“No-one suggests that red tape should get in the way of progress,” she added. “But when you’re setting out to test the clinical safety of a new service, remember that the rules are there for a reason. Just as you wouldn’t ignore the provisions of the Health and Social Care Act, or any other law, don’t ignore the Data Protection Act: you need a legal basis for processing personal data.”

Free and easy?
Following the conclusion of its enquiries, the ICO has imposed on the Royal Free a four-item to-do list, beginning with ensuring it sets out “a proper legal basis under the Data Protection Act for the Google DeepMind project and for any future trials”. The second measure the trust must take is ensuring it establishes how it will “comply with its duty of confidence to patients in any future trial involving personal data”.

Thirdly, the Royal Free is now required to “complete a privacy impact assessment, including specific steps to ensure transparency”. Finally, it must conduct an audit of the DeepMind trial, the results of which must be shared with Denham – who will be free to publish them. The trust has been asked to sign an undertaking pledging to abide by the law and by the ICO’s orders.

In recent months the Streams trial has progressed from safety testing into real-life use in clinical environments. The Royal Free – which has the ICO’s blessing to continue using the app – claims that the project has already had some demonstrable success in helping reach better clinical outcomes.

In a statement, the trust said it had supported the ICO’s investigations to date and will continue to do so.

“We passionately believe in the power of technology to improve care for patients and that has always been the driving force for our Streams app,” it said. “We are pleased that the information commissioner supports this approach and has allowed us to continue using the app which is helping us to get the fastest treatment to our most vulnerable patients – potentially saving lives."

The Royal Free concluded: “We look forward to working with the ICO to ensure that other hospitals can benefit from the lessons we have learnt.” 

About the author

Sam Trendall is editor of PublicTechnology
