In the wake of the Cambridge Analytica scandal, Facebook was forced to put on hold a research programme in which the company would have obtained detailed medical records from several leading U.S. hospitals and matched that data with patients’ Facebook profiles.
Facebook’s data sharing plans
According to Facebook, the research programme was aimed at helping medical professionals develop “specific treatment and intervention plans that take social connection into account”. Here’s a statement that the social media giant released about the research programme:
“The medical industry has long understood that there are general health benefits to having a close-knit circle of family and friends. But deeper research into this link is needed to help medical professionals develop specific treatment and intervention plans that take social connection into account.”
“With this in mind, last year Facebook began discussions with leading medical institutions, including the American College of Cardiology and the Stanford University School of Medicine, to explore whether scientific research using anonymized Facebook data could help the medical community advance our understanding in this area. This work has not progressed past the planning phase, and we have not received, shared, or analyzed anyone’s data.”
“Last month we decided that we should pause these discussions so we can focus on other important work, including doing a better job of protecting people’s data and being clearer with them about how that data is used in our products and services.”
News of the questionable research programme was first revealed by CNBC, which noted that while Facebook told the hospitals that personally identifiable information in the patient data would be obscured, the company planned to use a ‘hashing’ technique to match a patient’s medical and social profiles, which medical professionals could then use to develop specific treatment and intervention plans.
“Facebook’s pitch, according to two people who heard it and one who is familiar with the project, was to combine what a health system knows about its patients (such as: person has heart disease, is age 50, takes 2 medications and made 3 trips to the hospital this year) with what Facebook knows (such as: user is age 50, married with 3 kids, English isn’t a primary language, actively engages with the community by sending a lot of messages).
“The project would then figure out if this combined information could improve patient care, initially with a focus on cardiovascular health. For instance, if Facebook could determine that an elderly patient doesn’t have many nearby close friends or much community support, the health system might decide to send over a nurse to check in after a major surgery,” the CNBC report read.
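The CNBC report describes what is, in effect, hashed record linkage: each party hashes a shared identifier so records can be joined without exchanging the raw identifier itself. The following is a minimal sketch of that general technique, not Facebook's actual implementation; all names, fields, and the use of a salted SHA-256 hash are hypothetical.

```python
import hashlib

def hash_identifier(value: str, salt: str) -> str:
    """Hash a normalized identifier with a shared salt so two parties
    can match records without exchanging the raw identifier."""
    normalized = value.strip().lower()
    return hashlib.sha256((salt + normalized).encode("utf-8")).hexdigest()

SALT = "shared-secret"  # hypothetical value agreed between the parties

# Hypothetical hospital-side records, keyed by hashed email address
hospital_records = {
    hash_identifier("jane.doe@example.com", SALT):
        {"condition": "heart disease", "age": 50, "hospital_trips": 3},
}

# Hypothetical platform-side records, keyed by the same hash
platform_records = {
    hash_identifier("Jane.Doe@example.com ", SALT):  # normalization makes these match
        {"age": 50, "children": 3, "messages_per_week": 40},
}

# The join happens on the hash values, never on the raw emails
matched = {
    h: {**hospital_records[h], **platform_records[h]}
    for h in hospital_records.keys() & platform_records.keys()
}
```

Note that this yields pseudonymous, not anonymous, data: anyone holding both datasets and the salt can link a hash back to a specific person, which is precisely why the absence of patient consent drew criticism.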
Hospitals were willing to share patient data
While Facebook’s research programme, which could have given the company access to people’s confidential medical data, could be termed intrusive, the American College of Cardiology supported Facebook’s plans.
“For the first time in history, people are sharing information about themselves online in ways that may help determine how to improve their health. As part of its mission to transform cardiovascular care and improve heart health, the American College of Cardiology has been engaged in discussions with Facebook around the use of anonymized Facebook data, coupled with anonymized ACC data, to further scientific research on the ways social media can aid in the prevention and treatment of heart disease—the #1 cause of death in the world,” said Cathleen Gates, the interim CEO of the American College of Cardiology.
“This partnership is in the very early phases as we work on both sides to ensure privacy, transparency and scientific rigor. No data has been shared between any parties,” she added.
Even though the discussions went on for months, at no point in the process was the principle of patient consent raised. While Facebook told the hospitals that they could provide anonymised patient data to preserve patients’ privacy, the company’s hashing technique would still have let it match individuals’ medical and social media profiles without obtaining their prior consent.
Google did it too!
However, Facebook is not the only global firm willing to bypass consent requirements while harvesting additional user data. In the UK, the Royal Free NHS Foundation Trust earned the ire of the Information Commissioner’s Office after it emerged that the trust had provided partial records of over 1.6 million patients to Google DeepMind to help the latter ‘develop and deploy a new clinical detection, diagnosis and prevention application and the associated technology platform’ for the trust.
The sensitive records belonged to patients who had received treatment at the trust in the previous five years, contained personally identifiable information, and included data drawn from the trust’s electronic patient record system.
Even though the trust told the Information Commissioner’s Office that DeepMind used the 1.6 million patient records only for clinical safety tests and for no other purpose, the ICO held that the trust had failed to adequately inform patients that their data would be used in this way.
“Our investigation found a number of shortcomings in the way patient records were shared for this trial. Patients would not have reasonably expected their information to have been used in this way, and the Trust could and should have been far more transparent with patients as to what was happening,” said Information Commissioner Elizabeth Denham.