Geeta Nayyar, M.D., MBA: It’s clear that health systems have access to tools that can at the very least measure physician burnout, or possibly help us with a cultural change in physician burnout. But what do we think about the future? It’s hard to type, it’s hard to keep up with the medical record. What are some of the future technologies out there that can shift us into a whole different mindset?
Rasu Shrestha, M.D., MBA: I’m really excited about technology. If you look at the evolution of how physicians have interacted with the science and the art of practicing medicine, it’s interesting. We used to walk around with our bags from door to door.
Geeta Nayyar, M.D., MBA: I never did that.
Rasu Shrestha, M.D., MBA: No, but we used to. In the black and white era, that’s what we used to do as physicians.
Geeta Nayyar, M.D., MBA: I had the fanny pack.
Rasu Shrestha, M.D., MBA: Right, and then came Geeta with her fanny pack. But then with the advent of processes, and with the advent of regulations, we started noting things down on paper, which we then put in folders, right? And then came EMRs [electronic medical records]. And we converted this analog culture to a digital culture, actually quite blindly. So now we have e-folders. And we have e-phones, so these are now digital phones. And then what happened in terms of documentation, instead of scribbling notes down on paper, what we started doing was transcribing. Typing things out initially but then transcribing into a phone line, remember that? Transcribing into a phone line and there would be someone else out in the Philippines or the room next door typing out those notes for you.
Heather Staples Lavoie: I was one of those people.
Rasu Shrestha, M.D., MBA: There you go. What we did to put you out of business, out of a job was ….
Heather Staples Lavoie: I moved up.
Rasu Shrestha, M.D., MBA: We replaced it with voice recognition, right? And voice recognition, in many ways, I describe as typing with your tongue. Because what we did when we spoke into those Dictaphone mics was speak out the very words that we then wanted transcribed by the VR, the voice recognition technology. Instead of us typing with our fingers, we’re speaking things out.
And then evolution has happened, going from just regular voice recognition to now leveraging natural language processing (NLP), and the next leap forward, Geeta, to your question there, is artificial intelligence, right? Natural language processing looks at the syntax and the context of the specific terminologies — the medications, allergies, immunizations, problems, labs [laboratory reports], structured documents, the unstructured narrative of my postoperative surgical note, or the radiology report that I’m writing. The NLP is really mining for that information and structuring the information in the documents that I’m trying to create: pulling out specific codes that might be of relevance, risk stratifying the specifics of the diagnosis that I’m putting in place. Artificial intelligence, that’s where things are really moving toward, coming in and adding additional insight in ways that were just not imagined possible.
Geeta Nayyar, M.D., MBA: So, Rasu, you’re reminding me of when I was a young, new doctor on rounds, and we’d have the attending physicians — they looked a little bit like you, Rasu — and they would have those bags full of different physician reference guides, and I had my BlackBerry. And I remember they would say, “OK, so what do you think this is? What does the rash look like?” And I’d just really quickly look, “Oh well — tick, tick, tick.” So it’s clinical decision support, right? You’re talking a lot about clinical documentation.
Janae Sharp: Yes.
Geeta Nayyar, M.D., MBA: But what about clinical decision support? I’m a rheumatologist, this rash could be five things and I can’t remember the fifth. Where does something like that play into this? Because I would be a lot less burned out if I had some help.
Rasu Shrestha, M.D., MBA: Remember, before the BlackBerry, I had the PalmPilot.
Geeta Nayyar, M.D., MBA: You didn’t mention that before.
Rasu Shrestha, M.D., MBA: No. One of the solutions that I had in the PalmPilot was the diagnostic tool set, right? Remember that?
Geeta Nayyar, M.D., MBA: Yes.
Rasu Shrestha, M.D., MBA: I was able to very quickly find out a DD, a differential diagnosis, with a certain set of symptoms. And it was interesting. It was technology trying to come and play a role in the process of providing care. How that’s evolved today is clinical decision support algorithms that are inherently built into your electronic medical record systems. Add-on solutions that are being piled on to the electronic medical record solutions that bring in context around the longitudinal record; bring the evidence-based guidelines and clinical best practices and protocols and aid in the diagnosis; and interpret the specifics of what we’re actually doing at the point of care. So it’s really interesting to see how it’s evolved.
Janae Sharp: I think for me the real evolution I saw was a few years ago, when, if you searched for suicide, the top Google search result was the suicide hotline. And we talk a lot about the point of care, all those things. People are searching online. There’s a reason we call it Dr. Google, and I think the future of care is that Dr. Google is going to have some real doctors back there. So they’ve taken these large technology companies that are already where people are, like Facebook and Google, and they’re saying, “Look, we need to help these people. We can see that people are sick. They’re searching for ways to kill themselves, so let’s go ahead and have support there.”