The chief innovation officer discusses how the health system leverages AI.
As innovations enter the hospital, it’s important for executives to stay on top of new technologies. Healthcare leaders must learn what’s working well, and what isn’t, in other health systems. Only with that sort of knowledge can C-suites and their teams be confident in their purchasing and implementation decisions.
Tom Castles and I interviewed Daniel Durand, M.D., chief innovation officer of LifeBridge Health in Maryland, to take a deeper look at how he makes decisions about which technology his health system should implement and how LifeBridge is using artificial intelligence (AI).
Durand also serves as chairman of radiology. He earned his medical degree at the Johns Hopkins University School of Medicine. With over 20 years of experience in healthcare, science and technology, he has published in multiple medical journals.
Editor’s Note: This interview has been slightly edited for length, clarity and style.
Samara Rosenfeld: How has the role of chief innovation officer changed with new innovations?
Daniel Durand: My role was really ushered into existence by innovation itself. I think the forces that shaped the creation of the role were health systems’ need for enterprise-startup interaction and the large amount of digital health funding funneling in. Because health systems are interacting with so many more startups than they are used to, they have to deal with that in some way.
And one of the ways that we decided to deal with that at LifeBridge was to create a special part of the organization called “innovation,” which was bolted onto our previous research group. Within the innovation role, essentially 80% of the innovations that I’m focused on are digital health in nature: usually things outside of, or integrated with, the electronic health record. A few things are device-oriented, but most of it is software and so-called digital health.
S.R.: So what does your day-to-day look like?
D.D.: A lot of meetings with new companies and startups. LifeBridge Health has a BioIncubator, so I review proposals to help commercialize laboratory technology. Based on the space available on campus, I have to find which companies are the right fit for openings. We have companies graduating from our BioIncubator that have developed a product here and are moving on to a bigger space, which is the point of an incubator.
Now that companies are graduating, more companies can fill the slots. Part of my role is thinking through whether an organization has the highest probability of success, whether it’s a company that brings specific medical innovations that we or our patient population strategically need, or whether I should just take the company that will give LifeBridge the highest ownership stake for return on investment.
S.R.: How is your health system leveraging AI?
D.D.: I think LifeBridge is pretty typical of a community health system around the country, in the sense that we see a lot of opportunity for AI, but we do not have the AI talent — and few organizations do — to develop that AI on our own. And so we are taking advantage of the many startups that are out there and looking at a wide variety of use cases to see which one fits our criteria the best.
Some of the criteria would include things that align with the total cost of care and value-based care. Another is that we are not opposed to clinical uses of AI, but those are more difficult because they have to be FDA approved. The process for that isn’t very clear, and there aren’t a ton of FDA-approved AI algorithms out there yet. So most of what we are actually doing is looking at business processes, making a process smarter and identifying patients who need a certain care management intervention. We are looking for startups that do these kinds of things on the business transaction side or on the analytics and strategy side, because there’s a lot of inefficiency in those areas and they aren’t as tightly regulated.
A place like LifeBridge is looking at a huge funnel of conversations with startups. I would say that at least 400 of the 500 startups we are talking to are digital in nature. Of those, easily half are going to either claim to be AI or actually be AI. We are looking for the one, two or three to actually implement. Three companies have given us actual FDA-approved devices.
Tom Castles: How do you decide, from a business perspective, where you want to outsource this stuff and where you want to build the talent on the home front?
D.D.: As machine learning and AI become their own proper industry, it’s hard for any other industry to keep pace. The exceptional talent in those areas is pretty young, and even if those people wind up at a Google or an Amazon, sometimes they don’t stay there. They might leave to start their own company or be an early employee at a startup. It’s hard for us to match that. I think people who are really mission-driven might want to work at a health system, but if they want to stay on top of the AI game, they probably need to be surrounded by people doing a lot of the same work.
I don’t think a community health system is the right place for that kind of talent to flourish. I would say the same about developer talent. How big does a health system have to get before it makes sense to have its own in-house development shop? You may find big systems or universities that are so huge that some of this work is done internally. But I think the majority of health systems, and even innovations at bigger places, will outsource it. More and more, hospitals are going to need to figure out who their external partners will be.
T.C.: At one end of the spectrum, you have people in Silicon Valley who are saying that AI will replace everybody. On the other side, you have end-of-career doctors who are saying this tech is just getting in the way. Where do you stand on the whole digital health question and where do you think it’s most important for your C-level peers to fall on that spectrum?
D.D.: One of the jobs of the C-suite is to make sure that the operating environment is producing the best possible results for patients. And you cannot do that if you’re not also taking care of your workforce and your providers. You have to walk the line and constantly re-evaluate where the line is.
We used to have to write everything out, and that could lead to misreads and safety issues. Now we are in an entirely digital environment, and we have this next set of problems. We are in a situation where we have tons of structured data and people feel constrained by the environment. It’s pretty clear when you talk to doctors of any age that we have to do better than we have been doing. AI, other forms of automation and doing digital medicine smarter, getting better at it incrementally, is what we are all focused on. Any C-suite in healthcare that’s not focused on that is probably living in a different world than I am.
The digitization of healthcare absolutely needed to happen, and we are living now with some of its consequences. Some people saw them coming; some people didn’t. The next big step, and AI is part of it, is probably getting the best of both worlds: getting the personal touch back from the doctor while keeping the highly structured data that lets us use all of this technology for the betterment of patients.
A lot changed, a lot needed to change and a lot still has to change.