Michael Chertoff, the nation’s former homeland security secretary, weighed in on the issue at SXSW.
The rise of body sensors, networked devices, artificial intelligence, and whatever else Silicon Valley may dream up is bringing medicine to a new point in its growth. Leslie Saxon, MD, called this convergence a “unique handshake” between machines and humans—one that might augment healthcare or foster poor decisions. But a clear danger exists. “Cybersecurity is really the Achilles’ heel of this vision,” she said.
A cardiologist who heads the University of Southern California’s Center for Body Computing, Saxon has implanted hundreds of networked pacemakers into patients in her career. She understands the promise of the Internet of Things for medicine, for improving outcomes, research, and beyond. Her work centers on this game-changing tech.
Even so, during a panel at South by Southwest in Austin, Texas, she and two colleagues confronted the fears that keep them up at night. Their insights provided lessons for healthcare organizations—and everyone else, save for the folks who live off the grid. (Well, maybe them, too, should they ever require medical care.) The experts’ concerns also spotlighted a glaring question: Who’s responsible for ensuring our connected devices are secure?
Michael Chertoff, former secretary of the US Department of Homeland Security, pointed to “the proliferation of smart objects without rudimentary security features.”
Until recently, he said, society blamed the individual for buying a cheap digital device and connecting it to their network. Any problems that arose were their fault.
But a couple of years ago, a major denial-of-service attack used “tens of millions of devices that had been turned into zombies” to blast its target, a domain name system provider, Chertoff said. Seemingly innocuous devices—baby monitors, for instance—were forced into a hacker’s army. Each unit alone didn’t have much computing power, but united they were mighty.
The takeaway: Unprotected devices affect everyone. That holds special weight in healthcare due to the high stakes of, say, a pacemaker being hacked.
Chertoff said necessary guidelines likely won’t be put in place without some movement from regulators or class-action lawyers.
Beau Woods, a cybersecurity expert and Atlantic Council fellow, echoed Chertoff’s concerns and went a step further.
He said plenty of healthcare equipment is fragile, meaning it’s susceptible to “low-skill adversaries.” Some devices, of course, can handle an assault. That list includes implanted devices, which are more difficult to hack. But others, mostly legacy systems in hospitals, are vulnerable, Woods said.
“Devices in use at hospitals don’t take somebody with a lot of prowess to be able to hack them and cause a human-life public-safety nightmare, and that’s what really scares me,” he said. In some cases, people could even accidentally subvert these machines and cause harm.
What might solve this problem? Better designs with security baked into their systems by default, Woods said.
Although Saxon worries about overburdened and overmatched health information technology departments, the creeping thought in her mind has deeper roots in healthcare. “I can only reach those patients who I directly come into contact with, and there are so many unmet needs,” she said. But the same digital tools that hackers target may help change that.