Google DeepMind NHS Test Ruled to Break Privacy Laws


Wednesday 5th July 2017


The Information Commissioner's Office (ICO) has ruled that a UK hospital that shared data with Google did not do enough to protect patients' privacy as part of a test and diagnosis system for acute kidney injury (AKI), and that there were "several shortcomings" in the way patient data was handled.

The ICO reported that, among other information governance failings, the Royal Free NHS Foundation Trust did not do enough to protect patients' private information, nor did it tell the 1.6 million patients involved enough about how their data would be used, a breach of the Data Protection Act.

The test related to a system developed by the AI research company Google DeepMind, led by Demis Hassabis, and involved testing and implementing a novel method for detecting kidney injuries with accuracy equal to or greater than that of human doctors. As part of a partnership that would lead to the mobile app Streams, huge amounts of personal data relating to 1.6 million patients were handed to DeepMind and subsequently used in further testing of the app.

This last point in particular was the biggest issue the Information Commissioner found, as a huge amount of data that had not been anonymised was used as part of the testing process, including information neither necessary nor proportionate to the intended use. Elizabeth Denham, the Information Commissioner, noted that while the aims of the Streams app were positive, a patient at accident and emergency who fit the criteria for the information used "would not reasonably expect" their data to be used for the development of a mobile app.

The Royal Free NHS Foundation Trust has been censured and asked to submit to a third-party audit, complete a privacy assessment and set out how it will comply with its information governance duties, including ensuring that the project with DeepMind continues lawfully.

Some interesting points came out of this case, most notably involving the live use of Streams and DeepMind itself. Although Google, the owner of DeepMind, has come under criticism for its gathering of users' data and its applications, the ICO considered DeepMind in this case to be the data processor acting on behalf of the Royal Free, which was the "data controller" legally responsible for upholding the Data Protection Act.

Nevertheless, DeepMind did reflect on the case in a blog post welcoming the ICO's "thoughtful resolution" of the issue, and explored the need for its work to be not only for clinicians but also "accountable to and shaped by patients, the public and the NHS." DeepMind hoped the result would "guarantee" that patient data would continue to be handled safely and legally.

The case, the app and DeepMind itself are fascinating, and it is striking how little the ICO concerned itself with the use of the information in the live Streams system and how easily a private company was able to obtain it. In future, it is hoped that everyone using sensitive data, whether in the public or private sector, takes care to follow the law and to ensure that information is used safely, proportionately and with regard for people's privacy, irrespective of the benefits.