New regulations need to address security concerns and promote ethical use of patient health data.
The future of health data privacy is uncertain, but there seems to be some movement in the right direction. Sens. Lisa Murkowski, a Republican from Alaska, and Amy Klobuchar, a Minnesota Democrat who’s running for president, introduced the Protecting Personal Health Data Act last month, with the goal of closing security gaps in health data treatment for platforms that didn’t exist when HIPAA was created. This bill is a targeted piece of legislation that might end up attached to larger legislation. The importance of the topics addressed in the bill — such as data privacy and personal health apps — is only going to grow.
As companies like Amazon and Facebook venture into the healthcare sphere, we have to ask who is best suited for care delivery. Health data privacy laws and data breaches raise questions — and then there are the ethical concerns about a for-profit retail company entering healthcare delivery. Despite talk about patients being healthcare “consumers,” more selective patients will still require firm data protections. The company with the strongest consumer appeal might not have the strongest health data protection.
The problem is, the Protecting Personal Health Data Act doesn’t guarantee data protections. The bill doesn’t account for massive companies that refuse to adopt ethical data standards without an incentive to do so. There are no existing regulations to protect health data collected by nontraditional health data handlers, such as Facebook and health tracking apps, and many of those companies have shown themselves to be poor stewards of patient ethics and data safety. The proposed bill, unfortunately, offers guidance but fails to provide any real means of enforcement.
Regulations without enforcement are the equivalent of censuring a company like Facebook with a strongly worded tweet.
The balancing act of encouraging innovation and protecting patients is not going well. Legislators and the public have taken notice. To understand the Protecting Personal Health Data Act, it is necessary to look at the report upon which it was based. This report, published in July 2016, contains important findings about health data privacy. “Examining Oversight of the Privacy & Security of Health Data Collected by Entities Not Regulated by HIPAA” addressed gaps in both consumer protection and understanding of health data sharing safety. It didn’t initially gain much attention, but public scrutiny of Facebook’s data practices and data breaches has built on these findings and led to proposed legislation such as the Protecting Personal Health Data Act.
New technologies are changing how patients can manage their health, but these innovations are not regulated in the same way as traditional health data. Consumer-facing, for-profit companies are engaging in data-sharing practices that would be considered unethical or worse if they were a “covered entity” under HIPAA. Health tracking through things like apps, fitness trackers and genetic screening did not exist when health protection laws such as HIPAA were developed. These devices use data for commercial activities outside what most consumers realize, such as in online marketing.
The bill proposes leveling the field for innovators who fall inside and outside the realm of HIPAA. Technologies within the regulatory gap covered by this bill include social media applications such as Facebook; cloud- and mobile-based apps, like fitness and fertility trackers; and direct-to-consumer genetic testing. The act proposes that we define data sharing for these areas and suggests developing regulatory standards. It would create the National Task Force on Health Data Protection, consisting of 15 stakeholders who would address data deidentification methods, security standards, cybersecurity risks and consumer and employee data issues. The task force would also suggest updates to HIPAA and promote education about personal risks.
The first question we need to ask is, who should be responsible for developing health data safety guidelines for tech companies entering healthcare? Taking a closer look at the proposed legislation also highlights a problem: These technologies did not exist when we created existing health data privacy laws, and the legislative and regulatory bodies that would naturally be responsible for this kind of oversight don’t really exist, either.
The proposed bill suggests that a task force consider different aspects of consumer data sharing for entities not covered by HIPAA. Klobuchar and Murkowski mention that the purpose of data collection should matter — that healthcare stakeholders must “consider appropriate limitations on the collection, use or disclosure of personal health data to that which is directly relevant and necessary to accomplish a specified purpose.” But tech companies have already established a willingness to profit from data sharing and data manipulation with questionable ethics.
Just as health data privacy laws are lacking, we don’t have an appropriate Senate committee to enforce health data privacy in developing technology that falls outside of traditional established “covered entities.” Klobuchar sits on the Senate Commerce Committee, which oversees general data privacy but has no health data expertise. The bill suggests that the Secretary of Health, Federal Trade Commission, Office of the National Coordinator for Health Information Technology (ONC) and other relevant stakeholders be involved in creating guidelines. Each of these has a role to play, but we still lack the pieces needed to level the playing field.
The ONC was created in 2004 by executive order and mandated in 2009. It is primarily responsible for “coordination of nationwide efforts to implement and use the most advanced health information technology and the electronic exchange of health information,” according to the HITECH Act. The Senate Commerce Committee has authority to step in with sanctions for technology companies, but the ONC does not have authority to fine companies.
The ONC has the clear technical expertise, but for these privacy guidelines to have meaning, the ONC needs to be empowered with enforcement or matched with another group that has such capabilities. Is the ONC the correct group to lead the National Task Force on Health Data Protection? None of the parties involved has sufficient power to take on the data challenges and enforce new rules alone.
Guidelines on their own are important, but health data privacy will be no better than it was during recent scandals unless Congress cements enforcement.
For healthcare veterans entering the digital health world, the regulations that HIPAA provides serve as an existing guide for all health data. They also make the ethical questions very clear: All digital healthcare data should be protected, regardless of how they are collected. Many healthcare innovators already use a framework that promotes innovation within established HIPAA regulations.
One health-tech vendor that has taken a proactive approach is Omada Health. In June 2019, the company announced that it raised $73 million in funding to expand its digital therapeutics work. For Omada, patient data privacy in an app is just as important as securing protected health information. Patients decide what to share and how other organizations can interact with their digital footprint.
If a technology developer started with patient privacy at the center, they would not sell patient data. And Omada does not sell deidentified data — unlike other apps and direct-to-consumer health companies.
Regulations help narrow the universe of risk that people must understand. As Lucia Savage, J.D., chief privacy and regulatory officer of Omada Health, told me during a discussion of health data bills and security: “There is a big difference between a social media company and a company with HIPAA. If it is within HIPAA you can have a higher degree of confidence, even within digital health.”
For a company entering the digital health space or expanding into fitness trackers, a core understanding of digital information being protected removes confusion and potential ethical concerns. Could tech companies simply adopt the health data standards outlined in HIPAA — without being forced to do so?
“Despite the best efforts of the administration, the FTC and industry, no widely adopted, comprehensive voluntary code of conduct has emerged,” Savage told me.
We know that Facebook has purposefully manipulated emotions and that its attorneys say no one should expect any level of privacy or protection. Regulations are necessary to protect patients because, to this point, big tech has shown no baseline for ethical behavior.
Proposed regulations raise questions. Ethics are necessary, and enforcement remains to be outlined. Healthcare technology experts like those at the ONC might need to shift in their scope of responsibility.
Platforms like Amazon and Facebook, with ambitions to enter healthcare, have the potential to improve health, but their dual purpose of marketing products and selling to people creates ethical problems. Facebook ads know that I really enjoy running and yoga, and the nudge from an ad for Alo Yoga can be a great opportunity for healthcare. We know that Facebook understands what we look like and what our health might look like. Regulating the ethics around the use of that data would be a step toward equitable healthcare.
It is clear that health data privacy regulation needs to evolve to meet the needs of advancing technology. Regulations should create health data privacy protection that ensures patients understand that their data won’t be used against them.
Digital information collected from applications and trackers not covered under HIPAA will see more guidelines and — hopefully — better patient safety. How patient health data on social media and elsewhere should be limited, and whether advertisers should profit from our sharing, remains to be seen.
We need better regulations to encourage ethical innovation.
We need to create regulations and enforce them.