Will physicians always be the ultimate arbiters of patient care, or does their humanness leave aspects of their workflow better served by artificial intelligence?
Artificial intelligence (AI) is already making waves in healthcare. But many physicians feel that there should be strict boundaries between humans and machines, and that the intrinsic humanness of the doctor-patient relationship is here to stay. Others believe that AI will completely disrupt the practice of medicine, and that practicing physicians should make way for the machine intelligence that could one day replace them.
Here's where our expert panelists stand on the issue.
A Healthcare Analytics News® Peer Exchange®
Kevin R. Campbell, MD: Let me mention a little something here. The FDA has recently put an emphasis on the development of AI technologies. What do you think the FDA or the government’s role is in promoting this? Scott Gottlieb recently said, “I’m going to smooth out the pathway so that AI-type medical devices and medical technologies can find an easier path through the FDA.” What do you guys think?
Geeta Nayyar, MD, MBA: From my point of view, regulation is always trying to catch up to innovation, right? I think there’s a fine balance of innovate-innovate-innovate and then regulate-regulate-regulate. Telemedicine is a great example. We know it makes sense, we know it works, but the regulation is holding it up. With AI, I think the biggest fundamental issue is going to be trust. We’ve seen all the things about Alexa, right? This is really personal information. A mental health history, a sexual history, a cancer history. There are employment implications, there are family implications, and there are financial implications. I agree with you, John, that it’s about scaling the conversation and taking those algorithms. But fundamentally, I don’t want to talk to Alexa about my cancer diagnosis. I want to talk to a person. Humanity is not dead.
Kevin R. Campbell, MD: It could be supplemental, right. It’s supplemental.
John Nosta, BA: There are 8000 oncology publications a day: 8000 a day. Go ahead, you be my oncology expert. I want the computer. Today, I want the computer to augment my physician’s input.
Geeta Nayyar, MD, MBA: I’m not smarter than Alexa, but you can trust me.
John Nosta, BA: It’s IA, not AI.
Geeta Nayyar, MD, MBA: But you can trust me.
John Nosta, BA: For a few years, just a few years.
Geeta Nayyar, MD, MBA: But you can trust me, right? I’m not pretending to be smarter than Watson and Alexa, machine learning, Avenel, or whatever it might be. But there’s still that trust and there’s also HIPAA. There are things that keep you accountable.
Kevin R. Campbell, MD: But there’s a lot to be said about taking 5 minutes when you’re making rounds on 50 patients in the cardiovascular ward and sitting on the end of that bed and holding that patient’s hand and saying, “I understand you’re scared, yes. We’re going to have to put in a pacemaker defibrillator.”
John Nosta, BA: Oh, come on Kevin. And candlelight is really romantic.
Kevin R. Campbell, MD: I’m sorry, I’m a doctor and a physician and a healer first, and I use technology to make me better at what I do. But technology will never replace empathy, caring, love, and concern for our patients.
John Nosta, BA: I think that you’re barking up the wrong tree.
Geeta Nayyar, MD, MBA: John, how many kids do you have?
John Nosta, BA: Three.
Geeta Nayyar, MD, MBA: Three. Would you like them to be delivered by Watson?
John Nosta, BA: No, not Watson, but maybe Watson 3.0. Absolutely. Come on, da Vinci Surgical System, robotics, prostate therapy…
Geeta Nayyar, MD, MBA: Your wife may not agree with you.
John Nosta, BA: My wife of course is not going to agree with me.
Kevin R. Campbell, MD: Let me give you the perfect example. Da Vinci was sold to my hospital as a way to put left ventricular leads in patients who needed a Bi-V (biventricular pacing). Do you know what happened? Ninety percent of the time we dropped the lead and had to open the chest via thoracotomy and place it that way. Da Vinci was designed for that purpose but turned out to be fantastic for prostate disease.
John Nosta, BA: People die in driverless cars. Should we run from driverless cars?
Kevin R. Campbell, MD: No, I’m not suggesting that. But I’m saying that if you get into a world where technology is all you have, and you take away the humanness of medicine, you’ve destroyed the construct of the doctor-patient relationship.
John Nosta, BA: Kevin, why do you make humanness the standard of excellence?
Kevin R. Campbell, MD: Because that’s what doctors do. We care.
John Nosta, BA: Today.
Kevin R. Campbell, MD: We should always care.
Geeta Nayyar, MD, MBA: I agree with Kevin.
Kevin R. Campbell, MD: Otherwise there is no art of medicine. If you have sat at a patient’s bedside and seen the fear in their eyes, you understand. I can use Watson to tell me what chemotherapy drug to give at what rate, but connecting as a human, that’s what’s so important. That’s why I became a doctor.
Jane Sarasohn-Kahn, MA, MHSA: I have to interject with a fact here, which ties this together.
John Nosta, BA: Don’t confuse me with facts. I’m not interested in that.
Geeta Nayyar, MD, MBA: You must be an economist.
Jane Sarasohn-Kahn, MA, MHSA: I am, so I’m all about the data. This morning in my blog, “Health Populi,” I wrote about a new study that just came out, a global study in 10 countries. It was from Acxiom, the big data company, retail data, and the DMA in the United Kingdom, which is now owned by the Association of National Advertisers. Here’s what the study asked people around the world: “Who do you trust to hand over your personal information to?” It was not health information; the question was, “Who do you trust?” The number 1 organization type was doctors—in the United States, by an even wider margin—followed by banks, retail grocers, and government toward the bottom.
Kevin R. Campbell, MD: Facebook wasn’t mentioned? I’m kidding.
Geeta Nayyar, MD, MBA: It wasn’t the pharmaceutical industry, Humana, or payers?
Jane Sarasohn-Kahn, MA, MHSA: No. It was doctors above retailers, and you know how much I love retail health and my work. That’s the nexus of trust for data. We have to remember, AI is hungry for data, and the right data: It’s not just what’s in the EHR (electronic health record), it’s the sex, drugs, and rock-and-roll data. It’s talking about wanting to get on the floor with grandchildren, wanting to travel or golf: not just what’s in the Allscripts or Epic records.
We need patients to trust us enough to provide all of these data, the social determinants of health, along with the clinical data, so that AI can be as rich as it can be. Because if AI is just about the stuff in the EHR, we’re not going to really get to the heart of the patient. I totally agree that the art of medicine is part of the healing and the outcome.
Geeta Nayyar, MD, MBA: Jane, let me be clear. I agree with you and Kevin, and I disagree with John. It goes back to the first question, which is that patients are going to the doctor for access, for a conversation, and for a solution to their problem. AI has a role, but it’s not the role. They’re not going there for the technology. It has a role, and a very important role, to your point. But I go to the pediatrician with my 6-year-old because we have a question, we have a problem. If I can text my doctor, if my doctor uses an amazing computer in his chart, excellent. But I’m going to the doctor to see the doctor, that’s it. And I’m a doctor.
John Nosta, BA: Look, so patients put doctors on top of their list. You can ask anybody from Steve Jobs, who said, “Don’t ask customers what they want,” to the famous quote of Henry Ford, which is probably incorrect, “If I asked my customers what they wanted, they’d say, ‘Get me a faster horse.’” I don’t think in the context of innovation and disruption this meager human-centric patient perspective is going to drive the change. It’s going to let us wallow in mediocrity. I think you’re way off base.