Amazon's Alexa Really Isn't Ready For Healthcare


A disturbing story shows that AI voice assistants have a long way to go before they can safely handle patient data.


Editor’s note: This is a column written by Jack Murtha, senior editor. It reflects his views, not necessarily those of the magazine.

Next time you head to a healthcare-technology conference, count how many sessions it takes before a speaker brings up the potential of artificial intelligence (AI) voice assistants in medicine. I bet your tally won’t be very high. And for good reason: With nearly 50 million US adults now having access to smart speakers like Amazon Echo and Google Home, the technology looks like an invaluable channel for patient engagement and telehealth.

But smart speakers, or at the very least Amazon’s Echo devices and their Alexa voice assistant, are not ready to safely handle, store, and share protected health information. In fact, you might want to reconsider saying anything more than a song request in front of these devices.


No, I’m not a curmudgeon or some sort of Luddite. (In fact, I own and regularly interact with an Echo and an Echo Dot.) But I do keep up on the news, and this morning’s big story from Portland, Oregon, was too disturbing to simply digest and forget.

A couple, it turns out, learned that their Alexa had recorded a private conversation without their knowledge and sent it to one of the husband’s employees in Seattle, Washington, according to KIRO-7. The employee called the victims to let them know what had happened, proving her claim by describing their recent discussion about hardwood floors. The couple then unplugged their Echo devices, contacted an apologetic Amazon, and confirmed their suspicion.

Somehow, some way, Alexa sent a recording of a private conversation to a seemingly random name in the man’s contact list.

Just take that in.

Really, spend 10 seconds thinking about this violation of privacy—and the problems it might have caused had the exposed discussion been about something less vanilla than flooring.

OK. So, first, let me acknowledge that just about every expert I’ve encountered has said smart speakers and other types of AI voice assistants aren’t ready to handle sensitive patient data. Some have claimed that the voice-recognition technology needs more work, while others have focused on the unresolved cybersecurity and privacy concerns at play. Both worries are significant.

But for the sake of this column, let’s stick to the obvious issue of privacy. The news that broke today is terrifying on several levels.

For one, we must accept that Alexa is recording us whenever it is plugged in, an assumption that users should have made from the start. We also need to concede that we don’t know who, exactly, has access to this information, for how long, or in what format. Finally, we must acknowledge that once we speak in front of Alexa, we lose control of that data: the words that came from our minds and mouths.

That’s the deal that every Alexa user makes with Amazon. So, before physicians and patients bring Alexa into their treatment pathways, they would have to accept those terms. Although some of these issues already exist in healthcare (which is why data ownership and control are such hot topics), the presence of a smart speaker might amplify them, turning off potential users. (Plus, some research suggests patients open up more to virtual assistants than to doctors or surveys, so the medical information gathered by Alexa might be even more sensitive and graphic than what’s discussed in the doctor’s office.)

That’s the above-board stuff. Now, let’s get to cybercrime and malfunctions.

First, loved ones and acquaintances can manipulate Alexa to snoop on a user. This is exactly how one inquisitive woman caught her boyfriend cheating. While we might be willing to let that instance go, its implications extend beyond the heartbroken lover: People with shared access to a patient’s or provider’s Alexa might also have access to health information that is not their own. But I assume developers would address this issue before any smart speaker company made a move into healthcare.

What I don’t assume is that these organizations, even tech giants like Amazon and Google, would be able to protect their users from human malice and error. A recent Wired story highlighted just how simple it was for cybersecurity researchers to essentially wiretap an Alexa. And today’s headline screams that the bright minds behind these AI voice assistants are still working out major bugs.

So, will we ever be confident enough to share embarrassing, intensely personal medical information through these things? Probably, but today is not the day, and neither is tomorrow. Silicon Valley needs to secure these technologies, and prove that they’re secure, long before they will appeal to risk-averse health systems, clinicians, and patients. The good news is, these companies are thinking about HIPAA, and they don’t seem to be rushing into medicine beyond generalities.

And make no mistake: Healthcare stakeholders are licking their lips, thinking of how they can profit and their patients can benefit from AI voice assistants. Pharma, for instance, sees Alexa as a force for medication adherence. The tech might also help with incentivized surveys and even clinical trials. And its potential for telemedicine is clear.

Sure, data breaches affect brick-and-mortar healthcare organizations just about every day. Their employees sometimes mishandle protected health information and violate HIPAA. Existing cybersecurity protocols are deeply flawed.

But that’s no reason to let smart speaker manufacturers and their app developers off easy. Alexa isn’t in healthcare yet, and it shouldn’t enter healthcare until Amazon eliminates problems like the one that raised eyebrows and turned stomachs today.

