The Trouble with Validation and Consent in the Digital Age

In theory, most people are concerned with their digital privacy. In practice, the burning itch to engage with shiny new tech may trump that concern. Various experts at a Department of Health and Human Services symposium on digital privacy outlined why that happens and what it means for healthcare.

“The ease of just clicking ‘agree’ makes it very easy for all of us to just continue ignoring terms of use,” said Kirsten Ostherr, director of the Medical Futures Lab at Rice University. Between wearables, mobile devices with health capabilities, and websites or apps that make health promises and collect data, there are a lot of “agree” buttons to click.

“Devices are getting less expensive, going from a niche market to a mass market,” said Kathryn Montgomery, PhD, director of the Communication Studies Division at American University. “Wearables are dissolving the line between consumer market and health data system.” She said pharmaceutical companies and healthcare providers are increasingly aligning themselves with device makers to access the powerful data they accrue.

Although those data naturally inform the healthcare system and can be used to better understand patients, they can also be used in marketing or to track patient behavior, which is a worry. Montgomery said that comes with many of the issues inherent in big data aggregation, including the potential for large-scale breaches or misuse.

Not only that, she said, but the market lacks effective regulation.

“Consumer wearables fall between the cracks of what is already a fragmented and not particularly powerful set of government safeguards,” she said. She noted that some in the nonprofit healthcare community foresaw the situation and called for regulation years ago, but their warnings had not reached the public consciousness until recently.

Kate Black, JD, and Zerina Curevac, JD, gave a joint presentation at the Data Privacy in the Digital Age meeting that addressed possible solutions. Curevac is a data privacy and cybersecurity associate at Squire Patton Boggs, while Black is a privacy officer and general counsel for 23andMe. Given the depth and volume of health information a genomics company like 23andMe deals in, they spoke of a need to design the site’s interfaces with privacy and consent at the forefront.

Curevac decried the standard privacy policies and terms of use forms that consumers see as jammed with legalese. “This notice and choice model is ineffective in giving the consumers choice in ways they understand,” she said, calling it a “take it or leave it approach.”

Black described a 23andMe test in which consumers were presented with the same information in two different formats (one on a single page, the other spread across multiple pages) and were later tested on how much of the information they retained. “The information retention actually skyrockets,” she said, when patients are given the policies in stages as opposed to a single slab of text.

All those who spoke on the matter indicated that the problems could not be rectified unilaterally by better policy or better design. A multi-stakeholder approach, they said, is imperative.

Ostherr found irony in the public’s willingness to let third parties do as they wish with their data while remaining reluctant to enroll in clinical trials. “When you’re just agreeing to terms of use, you’re just an anonymous consumer,” she said. But when a researcher describes the data collection in person, people are more intimidated, which in turn explains the appeal that tech companies and device makers hold for healthcare in the first place.