The ICO has ruled that the Royal Free NHS Foundation Trust violated the Data Protection Act when it shared sensitive data of 1.6 million patients with Google DeepMind.
The Royal Free NHS Foundation Trust failed to adequately inform patients how their data would be used before sharing it with Google DeepMind.
In September 2015, the Royal Free NHS Foundation Trust entered into an agreement with UK-based Google DeepMind under which DeepMind would 'develop and deploy a new clinical detection, diagnosis and prevention application and the associated technology platform' for the Trust.
Google DeepMind then processed nearly 1.6 million 'partial patient records containing sensitive identifiable personal information' as part of clinical safety testing, to confirm that the technology was safe to deploy in live operations. The records belonged to patients who had received treatment at the Trust in the previous five years and included data drawn from the Trust's electronic patient record system.
Although the Trust has confirmed to the Information Commissioner's Office that DeepMind used the 1.6 million patient records solely for clinical safety tests, the ICO has held that the Trust failed to adequately inform patients that their data would be used for this purpose.
The ICO has now ordered the Trust to conduct a privacy impact assessment explaining how it will comply with the Data Protection Act in any future arrangement with DeepMind involving the processing of patient data or clinical safety tests. No financial penalty has been levied on the Trust.
“Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening," said Information Commissioner Elizabeth Denham.
“We’ve asked the Trust to commit to making changes that will address those shortcomings, and their cooperation is welcome. The Data Protection Act is not a barrier to innovation, but it does need to be considered wherever people’s data is being used,” she added.
Following its findings in the Royal Free NHS Foundation Trust case, the ICO has published a list of dos and don'ts for the health sector on compliance with the Data Protection Act.
According to the ICO, the 'price of innovation didn't need to be the erosion of legally ensured fundamental privacy rights', and innovative technologies can be deployed while still adhering to the Data Protection Act. The office also believes NHS trusts should be required to carry out privacy impact assessments before handing sensitive data over to third parties.
The fact that cloud-based technologies allow faster data processing also does not mean hospitals must resort to them. "NHS organisations, perhaps more than any other sector, need to remember that we are talking about the medical information of real patients. This means you should consider whether the risks to patient privacy are likely to be outweighed by the data protection implications for your patients. Apply the proportionality principle as a guiding factor in deciding whether you should move forward," the ICO said.
"Whether you contact the ICO or obtain expert data protection advice as early as possible in the process, get this right from the start and you’ll be well-placed to make sure people’s information rights aren’t the price of improved health," Denham concluded.