Decision Support Tools Necessary and Promising, but Still Lacking Validation

Ryan Black

A new study and 2 accompanying commentaries illustrate both the promises and the pitfalls of decision support.

(sketch of AlertWatch dashboard, from US Patent #8,454,507)

Decision support has grown into a primary area where data analytics can change healthcare. While guidelines and a clinician’s education are traditionally vital to the choices made in patient care, they are based on past best practices and consensus opinions. Decision support tools, on the other hand, aim to make suggestions based on thousands or millions of real-time data points, stripped of subjectivity.

But they are still developing, with questions remaining about what information they should incorporate, what recommendations they should make, and how they can be validated. This month Anesthesiology, the journal of the American Society of Anesthesiologists, explored the topic with a study on the use of 1 such system in surgery, alongside 2 accompanying commentaries.

The research focused on AlertWatch, approved by the FDA in 2014, which provides a real-time dashboard that displays patient vitals and delivers sound and text alerts in case of an important change, like a dangerous dip in blood pressure.

A team led by Sachin Kheterpal, MD, of the University of Michigan Medical School, looked retrospectively at over 26,000 patient electronic health records (EHRs), gathered over the course of 6 years. They compared experimental cases, or procedures where the AlertWatch system was used for 75% or more of the case (about 8,000), with parallel controls where it was used less than 75% of the time (about 11,000) and historical controls that predated the system entirely (about 8,000).
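The 3-way split described above can be sketched in a few lines of Python. This is a minimal illustration of the grouping logic only, assuming the 75% usage cutoff reported in the study; the function name `assign_cohort` and its parameters are hypothetical and do not come from the authors' analysis code.

```python
def assign_cohort(usage_fraction, predates_system=False):
    """Label a surgical case by decision support system usage.

    usage_fraction -- share of the case (0.0 to 1.0) with the system active
    predates_system -- True for cases recorded before the system existed

    Hypothetical sketch of the study's grouping criteria, not the authors' code.
    """
    if predates_system:
        # Cases from before the system was deployed at all.
        return "historical control"
    if usage_fraction >= 0.75:
        # System in use for 75% or more of the procedure.
        return "experimental"
    # System available but used for less than 75% of the procedure.
    return "parallel control"


# Example: classify three illustrative cases.
cases = [(0.90, False), (0.40, False), (0.0, True)]
labels = [assign_cohort(frac, pre) for frac, pre in cases]
```

Running the example yields one label per case, mirroring the roughly 8,000 experimental, 11,000 parallel-control, and 8,000 historical-control groupings reported.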

Use of the system “was associated with a risk-adjusted improvement in process-of-care measures among high-risk patients undergoing major inpatient surgery,” such as management of lung ventilation. No significant impact was seen on postoperative outcomes or length of hospital stay.

“Previous efforts at modeling healthcare improvement efforts in the mold of other high-risk, data-driven industries…have typically disappointed,” the authors write. They add that the new data is still encouraging, and tools like AlertWatch are necessary.

As some in healthcare are wont to do, Kheterpal looks to the aviation industry for lessons on healthcare data use.

"Physician anesthesiologists generally need to be aware of 40 different patient data streams at one time during surgery, including blood pressure, ventilation and heart rate,” much the same as an airline pilot must be aware of numerous environmental and mechanical factors during flight, he said in a statement. “These dozens of different data streams need to be integrated effectively to make important decisions,” he added, though the often-fragmented state of EHR data can make that difficult.

The 2 accompanying commentaries express concerns about the validation and regulation of such systems.

“My concern is that decision support systems in development may easily include a hundred or more alerts," the Cleveland Clinic’s Daniel Sessler, MD, writes in the first. Too many alerts and signals could result in confusion and distraction, leading to worse outcomes rather than better ones. Validation, he argues, is a necessity.

But validation is a tricky thing, as Gail H. Javitt, JD, writes in the second commentary. The healthcare attorney notes that the FDA has yet to outline a concrete framework for decision support tools, and much of the data needed to improve the systems can only be generated through extensive use post-approval.

“Clinicians will need to understand what the devices are or are not intended to do and may wish to consider what role they can play in the postmarket data generation process to establish clinical utility,” she writes, while noting that new initiatives at the FDA under Scott Gottlieb, MD, may provide “much needed clarity.”