The social media giant's now-halted pitch to health systems is actually somewhat similar to a health initiative it already runs...with one key difference.
CEO Mark Zuckerberg on stage at Facebook's F8 event in 2014. Image courtesy
Wikimedia Commons user Maurizio Pesce.
People share a lot on Facebook—at least, they did before recent events torpedoed public trust in the company. Data on age, location, offspring, and even indications about moral and dietary preferences all help fill out a detailed look at who a person is. And if paired with their health data, the combination could be truly powerful.
Facebook wanted to bring health data—and potentially protected health information (PHI)—into the fold, according to a recent story from CNBC. It reportedly sent a doctor to try to convince some large health systems to share de-identified patient health data with the social media giant, which it could then link to its own dizzying array of consumer data points. The company even said it was in talks with the American College of Cardiology and the Stanford University School of Medicine.
The plan was meant to focus on cardiac risks: The company pitched health systems on its desire to gauge whether it could credibly determine the social support systems in place for patients who are at elevated risk of a cardiac event. If a patient didn't have many friends or family nearby, the social media company could pass that along to the health system, which could then decide whether to intervene, for example by sending a nurse to check in.
Facebook already does something similar for suicide prevention. In November 2017, it announced that its web of learning algorithms could comb through user messages—and even Facebook Live content—in hopes of finding people who might be suicidal. It said that in the preceding month, the system had been used to notify first responders more than 100 times that someone in their locality might be at risk.
In that case, however, no PHI is being shared. In this newly revealed plan, Facebook apparently suggested linking individuals in its database with those in the healthcare data sets through cryptography, allowing both the health institutions and the social network to see the connections.
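The CNBC report did not detail the cryptographic scheme Facebook had in mind, but the general technique it describes, matching records across two organizations without exchanging raw identifiers, is commonly done with salted hashing. The sketch below is a generic illustration of that idea, not Facebook's actual method; the identifiers and salt are invented for the example.

```python
import hashlib

def hash_id(identifier: str, salt: str) -> str:
    """Normalize and hash a shared identifier (e.g., an email address).

    Both parties must agree on the same salt for their hashes to match.
    """
    normalized = identifier.strip().lower()
    return hashlib.sha256((salt + normalized).encode("utf-8")).hexdigest()

SALT = "shared-secret-salt"  # hypothetical value agreed on out of band

# Hypothetical identifiers held separately by each party
social_network_ids = {hash_id(e, SALT) for e in
                      ["alice@example.com", "bob@example.com"]}
health_system_ids = [hash_id(e, SALT) for e in
                     ["bob@example.com", "carol@example.com"]]

# Records are matched on hashes; neither side sees the other's raw list
matches = [h for h in health_system_ids if h in social_network_ids]
```

In this toy run only the overlapping identifier produces a match, so each side learns which records are shared without learning anything about the non-overlapping ones. Real deployments use stronger constructions (keyed hashes or private set intersection), since plain salted hashes of low-entropy identifiers can be brute-forced by anyone who knows the salt.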
But the ongoing Cambridge Analytica scandal apparently has it all on hold—and even if not for that, there seem to be flaws that might have kept it at bay. Even if health systems agreed to go along, they’d be creating a situation likely to draw regulatory attention. Were Facebook to be given access to PHI, dizzying questions then arise over patient consent, unauthorized access, and who would be liable if a breach occurred.
Certainly, these concerns face all of the tech companies rushing into healthcare. Facebook’s suicide intervention technology can’t be used in the European Union due to data privacy laws there. Uber, domestically, had a separate firm perform a risk analysis on its Health Insurance Portability and Accountability Act (HIPAA) compliance almost a year before it announced its Uber Health branch.
“This work has not progressed past the planning phase, and we have not received, shared, or analyzed anyone's data," Facebook said in its statement to CNBC. "Last month, we decided that we should pause these discussions so we can focus on other important work, including doing a better job of protecting people's data and being clearer with them about how that data is used in our products and services."
In the meantime, healthcare is already learning how to make good use of social media on its own, even without formal partnerships or potential PHI nightmares. Twitter, arguably Facebook's competitor, has been a goldmine for mental health researchers in recent years. That platform collects less demographic and contextual data than Facebook. But researchers have been able to conduct credible studies on post-traumatic stress disorder (PTSD), depression, and attention deficit/hyperactivity disorder (ADHD) based mostly on users' publicly available messages…although still, an author on one of those studies once told Healthcare Analytics News™ that "the whole thing has a creepy aspect to it."