Healthcare By Algorithm: Some Real-World Data

By Chuck Dinerstein, MD, MBA — Apr 01, 2022
Pneumonia is a relatively common problem requiring antibiotics and, in some cases, hospitalization or intensive care. Can physicians follow guidelines that deliver optimal care and reduce cost? Intermountain Healthcare offers a "yes."

Intermountain Healthcare is the largest health system in the Utah area and a national leader in patient safety. So it is worth looking at the system it developed to guide emergency room physicians in treating and hospitalizing patients with pneumonia. It is particularly important because Intermountain Healthcare controls 2,700 beds across 24 hospitals in three states, ranging in size from 18 to 472 beds.

The study looks at 16 of the system's hospital emergency departments, where an automated program was rolled out to assess patients with pneumonia and guide treatment decisions, including whether care should be outpatient or inpatient and, if hospitalized, whether the patient belongs in a med-surg unit or intensive care. The algorithms were based on the American Thoracic Society/Infectious Diseases Society of America (ATS/IDSA) pneumonia guidelines.

All patients aged 18 or older with a chest x-ray demonstrating pneumonia were eligible for the algorithm, which was available but not mandatory for ED physicians. Over 18 months, data were collected on roughly 4,500 patients with a median age of 67; 48% were female and 94% White. Outcomes included appropriate antibiotic selection, placement (outpatient or inpatient), and mortality; a cohort of patients treated before the algorithm was implemented served as controls.

The control and treatment groups were not an apples-to-apples comparison. The treatment group was a bit younger, predicted to have lower mortality, with a bit less chronic obstructive pulmonary disease (COPD) and a touch more comorbidity. Similar enough.

  • The 30-day all-cause mortality for all patients was reduced from 8.6% to 4.8%, a relative reduction of roughly 44%. The reductions in mortality were most apparent for critically ill patients admitted directly to ICUs from the ED.
  • Appropriate ED prescribing of antibiotic therapy increased from 80% to 88%. Initiating antibiotic treatment within an hour of arrival is a Medicare quality indicator, so this is a win for patients and for hospitals rewarded for quality performance. The mean time from ED arrival to antibiotic treatment was 159 minutes; the algorithm improved that to 150 minutes, still far from the 60-minute target. While the algorithms may have enhanced the choice and ordering of the medications, they did little to alter the workflow from chart to pharmacy to delivery and administration.
  • More patients were treated as outpatients, rising from 30% before implementation to 47% afterward. Among the hospitalized, half as many (from 13% to 6%) were admitted to the ICU. There was a slight uptick in subsequent hospitalization of those initially treated as outpatients.

Overall, it seems like a win for improving the quality of patient care and tamping down costs. But the fly in the ointment was human behavior. First, physicians' use of the algorithm was variable. In Intermountain’s larger hospitals, 69% of eligible patients were stratified by the algorithm; in the “10 smaller, rural hospitals,” the rate dropped to 36%. Changing behavior is challenging, although seeing your colleagues change is helpful.

Second, physicians rightfully applied their own judgment to the algorithm's recommendations, and not always in a good way. When the algorithm suggested hospitalization and the ED physician chose outpatient care instead, there was a higher readmission rate and, critically, a higher 30-day mortality (7.2%, vs. 5.7% when the algorithm itself recommended outpatient placement). Trusting the algorithm is not a given, and there will be human error. That said, this was a good demonstration of an algorithm's power to improve care.

There are a few nuances the researchers failed to mention that we should consider when rolling out algorithmic care.

  • Even in an integrated system with a great track record in patient safety, shifting physician behavior was complicated and variable; successfully porting this algorithm into other settings is not guaranteed.
  • Who in medical leadership determines an appropriate set of guidelines and updates it as necessary?
  • Who in medical leadership is responsible for the system's surveillance to assure that it continues to operate as designed and enhances patient safety?
  • Who in the health system's Information Technology services is responsible for maintaining this system?
  • In the event of a lawsuit claiming algorithmic liability, which is inevitable, who will have recorded the thinking behind the algorithm’s earlier versions, and who will be held liable?

At this juncture, each health system and electronic health record provider is left on its own. Who is willing to ask what counts as “best practices”?

 

Source: A Pragmatic Stepped-wedge, Cluster-controlled Trial of Real-time Pneumonia Clinical Decision Support, Am J Respir Crit Care Med, DOI: 10.1164/rccm.202109-2092OC


Chuck Dinerstein, MD, MBA

Director of Medicine

Dr. Charles Dinerstein, MD, MBA, FACS, is Director of Medicine at the American Council on Science and Health. He has over 25 years of experience as a vascular surgeon.
