
July 3, 2017

DeepMind’s data deal with the NHS broke privacy law

by John_A

An NHS Trust broke the law by sharing patient records with Google’s DeepMind division, the UK’s data watchdog has ruled. The long-awaited decision falls in line with the conclusion drawn in May by Dame Fiona Caldicott, the UK’s National Data Guardian. The pair’s agreement “failed to comply” with the Data Protection Act, according to the Information Commissioner’s Office (ICO), because patients weren’t informed that their information was being used. The ICO also took issue with the volume of data — 1.6 million partial patient records — leveraged to test Streams, an app for detecting acute kidney injury and other serious medical issues.

In April 2016, New Scientist revealed that DeepMind and Royal Free London NHS Trust had started working together. As the ICO notes in its letter to the Trust, their agreement was actually formalised in September 2015, with Royal Free serving as the data controller (owner) and DeepMind as the data processor (partner).

Initially, the deal was hashed out so Streams could be further developed by DeepMind and put through clinical safety testing. During this period, the ICO says the Trust — which ultimately takes responsibility, as data controller — broke four principles in the Data Protection Act. The first, which requires processing to be “fair, lawful and transparent,” was broken because the Trust and DeepMind didn’t go far enough in their efforts to contact patients. Common law does allow consent to be implied when data is used for “direct care,” but safety testing is not the same as full-blown medical use, so the ICO disregarded this defence.

“The Royal Free did not have a valid basis for satisfying the common law duty of confidence and therefore the processing of that data breached that duty,” the ICO said in its letter to the Trust. “In this light, the processing was not lawful under the Act.”

The second principle the Trust broke — the third listed in the Act — requires data processing to be “adequate, relevant and not excessive.” Following its investigation, the ICO concluded that the 1.6 million records were too much for DeepMind’s purposes. The Commissioner considered the Trust’s arguments, but couldn’t see why the sheer volume of information was needed for the trials. “The Commissioner is not persuaded that it was necessary and proportionate to process 1.6 million partial patient records in order to test the clinical safety of the application. The processing of these records was, in the Commissioner’s view, excessive,” the ICO said.

The Trust’s failure to alert patients broke another principle, which states that data should be processed “in accordance with the rights of data subjects.” Under section 10 of the Data Protection Act, patients should have the opportunity to remove their information from any data processing. The lack of transparency meant there was no way for patients to know about the processing, let alone act on this right. “Put plainly, if the patients did not know that their information would be used in this way, they could not take steps to object.”

The final principle concerns data protection standards. The ICO notes that additional agreements, including a privacy impact assessment, were drawn up in January and November 2016. By this time, however, DeepMind had already processed patient data. The watchdog is content with how DeepMind now handles the data, but took issue with the documentation drawn up back in September 2015. It didn’t go far enough, the ICO says, and could have resulted in a breach of the Data Protection Act. “The Commissioner does, however, recognise that the Royal Free has since improved the documentation in place between the Trust and DeepMind.”

The documentation drawn up last November means that Streams is now on a better legal footing. That’s important, given the app has now passed the testing phase and is being used in hospitals. Nevertheless, the ICO has asked the Trust to commit to a series of changes. These include establishing “a proper legal basis” under the Act for the current DeepMind agreement and any future trials, completing a new privacy impact assessment, and commissioning an audit of the initial Streams trial period.

The ICO has stressed that it doesn’t want to impede innovation. It recognises the work DeepMind does and the impact Streams has already had on the National Health Service. Progress, however, should not come at the cost of privacy, Information Commissioner Elizabeth Denham argues. “What stood out to me [after] looking through the investigation is that the shortcomings we found were avoidable,” she said in a blog post. “The price of innovation didn’t need to be the erosion of legally ensured fundamental privacy rights. I have every confidence the Trust can comply with the changes we’ve asked for and still continue its valuable work.”

Source: ICO
