AI, smart speakers bring about a Wild West of health data questions.
When Amazon announced earlier this summer that it would partner with the National Health Service in Britain to provide vetted health information to Alexa users, the deal prompted a mix of reactions. Under the arrangement, Amazon agreed to use NHS resources to answer health-related queries asked of the artificial intelligence platform.
Some in the U.K. praised the deal as a positive step toward ensuring consumers receive accurate information when they search through a smart speaker. Others were quick to raise privacy concerns, worrying about the implications of giving Amazon access to the health data of millions of Britons.
If the prospect of “Paging Dr. Alexa” raises concerns overseas, the questions for customers in the U.S. are equally -- perhaps even more -- complex.
A tech company that receives healthcare information based on a consumer’s internet search generally is not bound by the Health Insurance Portability and Accountability Act (HIPAA), the main privacy law governing healthcare data in the U.S., said Natasha Duarte, a policy analyst at the Center for Democracy and Technology.
“HIPAA does not cover that information, and it’s something that I’m not sure everyone realizes,” Duarte told Inside Digital Health™.
HIPAA covers entities that directly provide care to consumers, such as hospitals and clinics, she said. And it applies to business associates of those providers who might come into contact with patient health data, she added.
However, when a patient discloses health information online or through a search query, neither the search engine nor the source of the resulting health information is bound by HIPAA unless they happen to fit into the provider or business associate categories, Duarte explained.
As in the U.K., Amazon has been looking for ways to expand its healthcare offerings in the U.S. In April, the company announced new partnerships with insurers, providers, and digital health companies to create six new healthcare “skills.” The skills -- sets of concrete tasks like checking on the status of a prescription or providing a hospital with post-surgical status updates -- are HIPAA-compliant, Amazon says. That means the data users disclose when using the specific skills is protected in the same way hospitals and insurers protect other health data. Amazon said it plans to expand the AI platform’s HIPAA-compliant skills.
Still, Alexa’s new skillset covers just a fraction of health-related uses for smart speakers, and Amazon is just one of a number of companies moving into the space.
That means the large majority of health-related information or queries sent through smart speakers are not protected by HIPAA, nor is the information dispensed by the AI platforms necessarily accurate.
If an AI platform from the likes of Amazon, Google, or Apple were to form exclusive information partnerships with trusted providers of health advice (as in the U.K.), Duarte said, it would raise questions about how much control users have over the sources of their healthcare information.
Such an arrangement would need to be highly transparent, she said.
“So if whenever you make a health query your information is all coming from one source or at least one source that’s aggregating information, that should be transparent to people because it’s a departure from the way that a search engine would normally work,” she said.
Determining exactly what counts as a health question is another issue, Duarte said.
“If someone is looking for information on diet, is that a query about a health issue or health condition that should go to NHS or is it more of a lifestyle issue if people are interested in a particular diet?” she said.
The answer to that question could have implications for where an artificial intelligence platform directs the question, as well as for the privacy of the person asking it, experts told Inside Digital Health™.
Outside of the AI ecosystem, computer and smartphone users access a wide range of health information without any protection from HIPAA. Some of that online health information is provided directly by healthcare providers, a growing number of whom have created customer-facing health information sites.
Abner Weintraub, a consultant who advises healthcare and technology companies on HIPAA compliance, said providers that create health information websites are obliged to abide by HIPAA regulations.
“In general, the answer is yes they do have to be more careful [with patient data],” he said.
Because healthcare providers deliver care directly to patients, they fall under HIPAA’s umbrella in a way that media companies do not.
But that doesn’t always happen in practice, Weintraub said. In most of those cases, providers evade HIPAA requirements by asking users to accept lengthy user agreements that waive their rights to health data protections.
“They’re basically getting out from under HIPAA to a large degree by putting in the fine print sections that essentially waive the patient’s rights to privacy,” he said.
Weintraub believes this is exploitative and deceptive.
That workaround could theoretically transfer to an AI environment, further complicating the transparency issues raised by Duarte.
If there’s a fix for the problem, regulators in the U.S. have yet to find it. Weintraub said only two jurisdictions, California and New York, have passed strict state-level data protections. California’s is the tougher of the two, but both require a broader spectrum of entities to keep their residents’ sensitive personal information private.
If regulators tackle the issue at a federal level, they will need to confront a handful of key questions. One question, Duarte said, is whether information should be treated differently when it comes through a smart speaker as opposed to a search query on a computer or tablet. She noted that some technology companies have different privacy policies for the different platforms.
Weintraub said he thinks regulators should treat information from smart speakers as even more sensitive than laptop or mobile phone searches, given that the queries take place in the user’s home and that the devices are technically capable of recording audio and gathering information even when the user isn’t asking a question. The so-called “Castle Doctrine” gives homeowners additional rights to self-defense when they are subject to a crime in their own home as opposed to out in public, Weintraub noted.
“The Castle Doctrine doesn’t apply to health information, but I bring it up as an example because state and federal laws all acknowledge in one way or another that you have special rights within your home that don’t exist outside the home,” he said.
Duarte said the swiftest action will likely come at the state level, where states like Washington and Maine have been working on legislation that would bolster privacy protections.
“So there’s momentum in the states,” she said. “Federally there’s momentum, it’s just a little bit less clear how quickly things can or will move.”
In the meantime, Duarte said, it’s not practical to ask users to stop looking up health information online, particularly in an era when patients increasingly are asked to act as consumers and comparison shop.
Weintraub said users should consider privacy-focused search engines like Startpage and DuckDuckGo, and should close their browsers and clear their caches often. Outside of those privacy-focused search engines, he said, people need to use the internet with the understanding that their data might not be kept private.
“Treat all searches that don’t happen [through a privacy-focused browser] as public information,” he said. “That’s not going to be easy for people to digest or hear. That’s not what the people want to hear. That’s disturbing information.”