
PX: Is AI a Physician's Tool or Future Physician?


Artificial intelligence (AI) has proven to be a useful tool for physicians. But could AI take over their role in the future?

Should artificial intelligence (AI) in medicine be used as a tool for the physician, or could AI theoretically replace the physician entirely? Our expert panel weighs in:

Leveraging Big Data and AI in the Hospital of Tomorrow

A Healthcare Analytics News® Peer Exchange®

Segment 5/11

Kevin R. Campbell, M.D.: John, you work with a lot of customers, a lot of companies, and I’m sure that you counsel them on what’s the role of AI in their business. What’s the role of these types of technologies? What would you tell the average healthcare company [about] how they should use AI? What are the downfalls of AI? I know that’s a loaded question.

John Nosta, B.A.: It is a loaded question, but first of all I think that the hype is appropriate. You know we talk about big data, and to me, big data [are] data. It's the AI—the analytics component—that is the spot where we take all of this information. Earlier Dave mentioned the Gartner Hype Cycle; I think that's a standard tool that looks at about 3,000 points of innovation, and they put AI at the top of the hype. So now it's what everybody is talking about. It's all hyped up. I guess to a certain degree there may be an overreach in the utility of AI in clinical medicine today, or in experimental medicine.

But the interesting thing is what Gartner said. And I'm using Gartner as an arbiter of mainstream thinking, not fringe, kookie, over-the-top stuff. Gartner said, in the headline to the story, that today we see the blurring of the distinction between man and machine, and I think that is the sort of buried little secret here: we are seeing advanced computation that will actually mimic human reason and thinking. Dave, as you said, you make a mistake and you learn, you make a mistake and you learn, you make a mistake and you learn—that [sounds] kind of like an internship to me, doesn't it?

Kevin R. Campbell, M.D.: That’s how we train Fellows at Duke.


John Nosta, B.A.: Right, so I think that AI is really becoming part of the human reality. So, I really talk to my clients about understanding the nature of hype—but don’t take your eye off the ball—because this is the fundamental game changer in society today.

Kevin R. Campbell, M.D.: One of the things that I’ve been very impressed with from a regulatory standpoint is that our FDA commissioner, Scott Gottlieb, has embraced AI and automation, and is working very hard with his committees there within the FDA to create a way to quickly approve things like AI in medical devices. How do you think, Geeta, that’s going to impact innovation if we say, now, maybe we can make it easier for you, maybe we can make it less costly, and there’s an actual real pathway to approval with the FDA?

Geeta Nayyar, M.D., MBA: The biggest thing behind AI and machine learning is this idea that all of these patterns, all of these algorithms, looking at a million EKGs [electrocardiograms], will get us to the answer quicker, and we can identify a pattern or a diagnosis earlier and better—and even more impactfully if we can predict. So from a cost standpoint, from an FDA standpoint: if we can predict that someone is going to have a chronic disease, or is going to have an acute event, and we can make an intervention, we can come up with the treatment regimen, we can come up with a diagnosis that then prevents a future generation from suffering, or being a cost to the system, or having future morbidity or mortality. The prediction is the prevention, and that is the cost-containment aspect to everything we're trying to do in healthcare—the whole point of value-based care. If AI helps us get there, it helps every stakeholder in the industry, and most importantly, the patient.

John Nosta, B.A.: When my wife comes home from the pediatrician, one of the first things I ask her is, “What did the doctor say?” That’s my point of interest. Tomorrow, I’m going to ask her, “What did the computer say?” And then my next question will be, “And what did the doctor do about that?” I think that the empowerment of technology to help augment clinical practice is going to be profound.

Daniel Kraft talks about artificial intelligence. He flips the letters around. He talks about intelligence augmented, and I think that's the dynamic we're seeing here: there is an intrinsic intelligence to a physician that is not infinite. And I think that that intelligence can be augmented through AI and technology to provide, if I dare go that far, another member of the group practice.

Geeta Nayyar, M.D., MBA: That's a good point, John. One of the things I want to fundamentally say, that I think we're all saying in different ways—and we have two cardiologists here on the panel, right—heart failure is heart failure, an MI [myocardial infarction] is an MI. The tools to diagnose those pathologic states have evolved with time. The treatments for MI and CHF [congestive heart failure] have evolved. AI is another tool. The MI is the MI. CHF is CHF. The EKG shows AFib [atrial fibrillation] or it doesn't, right? This is about embracing the tools. So, to your point, John, the doctor is still the one doing the treatment, still coming up with the plan of action, but if you have better tools—just like we did open-heart surgeries and now we do a lot more noninvasive cardiac procedures—why not? It's still the same patient.

John Nosta, B.A.: Yeah, but Geeta, to relegate AI to just another tool subordinates the power of this initiative.

Geeta Nayyar, M.D., MBA: I don’t think so.

Kevin R. Campbell, M.D.: I would disagree with that. It’s a tool, and I want every tool. I want the fanciest defibrillator. I want the fanciest catheter. I want the best operative tools. I want the best diagnostic tools. If I can have a tool like AI make me smarter, make me more effective, make me able to predict disease rather than treat disease, we’re going to make a difference. But an AI, a machine, cannot sit on the end of a bed, hold a dying woman’s hand, talk to her about the procedure that she’s about to have, empathize with her. So, it has to be both. You have to have doctor and machine. Dave, what do you think about that? I mean we both trained at Duke.

David E. Albert, M.D.: Medicine is more than just a science because you’re dealing with human beings, and I love the notion that human beings should be handled with care, OK? We are complex organisms. So, I see AI — you know, one of the great potentials — we talk about oncology, we talk about cardiology, we talk about dermatology and radiology, but, quite frankly, I think one of the great opportunities is in primary care. My oldest son is a resident in his last year of internal medicine residency, and as he says, “Dad, primary care is algorithmically driven when it’s done well.” Guidelines driven. But you know what, I don’t walk around with the guidelines in my head, and I don’t walk around with the algorithms in my head. But if we have people in a CVS, in a Walgreens, in a Walmart, in a consumer-oriented environment who can be augmented in terms of their effectiveness, diagnostic skills and their triage capability, by augmented intelligence, then I think that’ll be a huge perk. And by the way, coaching; we talked about it, Kevin. We spend all our time treating. We need to be spending more time preventing.

Lifestyle, diet, exercise, not smoking: all of these things are critically important if we’re going to address the issues that we face in America. And, quite frankly, face around the world. I think AI will have a place in all of those because it can take all this data and personalize to you.

John Nosta, B.A.: I just wonder if AI will one day consider the physician a tool to prescribe.

Kevin R. Campbell, M.D.: You’re talking about Terminator level.

John Nosta, B.A.: Oh, come on. If you want your reference point for AI to be Hollywood, yes, you can go that way. But I wanted to bring it into a more scholarly perspective. I would argue that AI, yes, can use the physician as a tool to inject the appropriate amount of empathy.

Kevin R. Campbell, M.D.: Totally disagree.

John Nosta, B.A.: Empathy into the scenario.

Kevin R. Campbell, M.D.: I don’t think a machine can show empathy. Unless you’re talking about an android that has not been delivered to us yet, there is no machine that can show empathy. That is a human emotion. That’s what makes us, that’s what gives us a soul, gives us a purpose: empathy, caring, compassion. A machine is a machine. My car can’t feel bad because it’s raining. My car just drives through the rain.

John Nosta, B.A.: Well, plot it, you know? Take the data and plot the curve, and I think you’ll be very, very interested in some of the potential changes in the context of humanity. And ultimately you have to ask yourself: is AI man’s last great adventure?


© 2024 MJH Life Sciences

All rights reserved.