Commissioner says hospital trust acted on the wrong side of data protection law in sharing patient details with DeepMind for the development of a kidney disease alert system
The Information Commissioner’s Office (ICO) has come down against a hospital trust for its release of patient data to an artificial intelligence company for technology trials.
It said the Royal Free NHS Foundation Trust was responsible for several shortcomings in how the data was handled in its dealings with DeepMind, including that patients were not adequately informed that their data would be used as part of the trial.
The trust has been asked to sign an undertaking to change procedures in line with the law. These include establishing a proper legal basis under the Data Protection Act (DPA) for projects with DeepMind, and setting out how it will comply with its duty of confidence to patients.
It will also have to complete a privacy impact assessment, and commission an audit of the trial, the results of which will be shared with the ICO, and which the latter will have the right to publish.
Information Commissioner Elizabeth Denham said: “There’s no doubt the huge potential that creative use of data could have on patient care and clinical improvements, but the price of innovation does not need to be the erosion of fundamental privacy rights.
“Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the trust could and should have been far more transparent with patients as to what was happening.”
Kidney alert
In November of last year the Royal Free announced a five-year deal with DeepMind, following a year-long partnership, to develop digital solutions for healthcare. It provided personal data on around 1.6 million patients as part of a trial to test an alert, diagnosis and detection system for acute kidney injury; but this was found to have gone against the terms of the DPA.
The incident prompted Denham to outline lessons from the case in a blogpost. She said there is big potential in the creative use of data for patient care, but that caution is needed in four areas.
First, such issues do not come down to a choice between privacy and innovation; the former should not be sacrificed for the latter. Second, privacy impact assessments are a crucial part of digital innovation and should be carried out for all such projects.
Third is that it is not always appropriate to use cloud processing technologies; and fourth that it is necessary to know the Data Protection Act and follow it.
“Get this right from the start and you’ll be well placed to make sure people’s information rights aren’t the price of improved health,” Denham said.
Data guardian welcome
The ICO decision was welcomed by the national data guardian for the NHS, Dame Fiona Caldicott. She said her panel had been liaising with the ICO in its investigation and that she shared the view that the Royal Free had not used an appropriate legal basis for data sharing.
“I am keen to see new technologies developed, tested and used to deliver better, safer, more accurate and timely care to patients,” she said. “But I am also very clear that where patient data is used, it is essential that this is done transparently, in line with the law and regulatory frameworks.
“I concur with the points that the ICO has made that much more should have been done to inform patients about the project and to allow them to opt out of their data being used to develop and test the technology if they were not happy for it to be used for this purpose.”
Image from ICO, Open Government Licence v3.0