Opinion | Articles | December 26, 2025

Laying the groundwork for smarter AI in clinical care: 4 essentials for a thoughtful strategy | Viewpoint

By Angela Adams

Careful planning can improve the chances of reaping the benefits of AI, without piling more on already‑stretched teams.

Healthcare leaders are all‑in on AI. Eighty-five percent are experimenting with or implementing AI tools to improve care and operations, with global spending projected to reach $164 billion by 2030. Yet clinicians are rightly cautious.

In one global survey, only 38% of frontline professionals said current solutions address real clinical needs. Nearly half worry that if AI is deployed poorly, diagnoses could slow (46%) and burnout could rise from added non‑clinical work (46%). The job for hospital and health system leaders is to choose deliberately so AI lightens the load rather than adding to it.

So what’s worth adopting—and how do you roll it out without piling more on already‑stretched teams? Here’s a practical roadmap built around four essentials.

1. Start with high-impact, proven use cases

Begin where the pain is sharpest and the evidence is strongest: work that drains time and morale. Administrative burden and burnout cost U.S. health systems an estimated $4.6 billion annually, so targeting these burdens first can yield immediate returns. Agentic AI tools are already excelling at tasks like appointment scheduling, follow-up reminders, insurance pre-authorizations, and documentation, which consume countless hours of staff time.

By offloading that work to agentic AI, clinicians and staff claw back valuable time. At the same time, hospitals and health systems can reduce no-shows, speed up approvals, and, most critically, improve patient outcomes.

2. Balance short-term wins with long-term transformation

Pilots and point solutions matter. But they won’t cure structural problems like fragmented data and aging systems. More than three‑quarters of clinicians report losing clinical time to incomplete or inaccessible patient information.

About a third lose nearly an hour per shift. That adds up to a staggering 23 workdays per clinician each year. In the short term, AI can help by filling some of these gaps (like integrating data from multiple sources or flagging missing info). Ultimately, though, closing those gaps for good requires modernizing IT infrastructure and phasing out systems that trap data in silos.

In parallel, modernize data architecture and interoperability so future capabilities have a sturdy foundation. Hospital CIOs and CMIOs should treat AI initiatives as part of a broader digital transformation, not a scatter of isolated apps. With that foundation in place, comprehensive integration by 2030 could automate much of the administrative load and, in turn, potentially double patient capacity.

3. Hire for transformation, not tradition

Technology falters without the right leadership. As health systems expand their AI initiatives, success depends on leaders who can bridge clinical needs with technical innovation and drive process change that leverages the best of human and AI capabilities.

Look for hybrid talent that blends clinical understanding with product sense, data fluency, agile ways of working, and human‑centered design. Consider nontraditional candidates from tech, finance, or other fast‑moving sectors to complement internal clinical expertise, and create roles that can steer AI strategy across silos.

Many organizations are also partnering with technology companies and hyperscale cloud providers rather than building everything in‑house. In one survey, 61% of AI adopters favored third‑party partnerships.

Just as important, stand up governance early. A clear AI governance framework keeps efforts aligned with clinical priorities, sets safety and ethics guardrails, and forces attention to measurable outcomes. Finally, leading organizations are building out high-reliability, AI-enabled systems with the help of collaborative learning programs that bring collective expertise to the challenge.

4. Break through cultural resistance to change

The “culture of no” is real. New tools fail when the people who must use them don’t trust them.

Bring physicians, nurses, and other end users into the process from the start—co‑design workflows, pilot in real settings, and run feedback loops that actually change the product.

Address legitimate concerns head‑on: accuracy, bias, and accountability. More than 75% of clinicians remain unclear about liability when AI errs. Be explicit about validation methods, known limitations, and where human oversight sits in the workflow. Confidence grows when safety nets are visible in daily practice.

Making AI work: A practical path forward

A smarter AI strategy starts small, thinks long, and stays human.

Focus first on clear‑value use cases—especially those that relieve administrative strain and burnout. In parallel, invest in the plumbing: interoperable data, modernized infrastructure, and disciplined governance.

Recruit leaders who can bridge medicine and technology, and equip them to steward change with transparency and rigor. And bring clinicians into the process early so AI becomes a helping hand, not another burden.

Do that, and the benefits arrive in two waves. Near term, you streamline operations, reduce no‑shows and bottlenecks, and return time to caregivers. Over time, you create the conditions to safely adopt more advanced capabilities as they mature because the data pipelines, guardrails, and culture are ready.

There will be skepticism to overcome, processes to re‑engineer, and legacy systems to unwind. But with a grounded plan, AI becomes a practical engine for better patient care and a more sustainable health system—rather than the latest hype cycle.

Angela Adams is CEO at Inflo Health.

