Understanding Desired Outcomes and Goals Is Crucial in AI Adoption
By Christopher Cheney
Health systems and hospitals need to understand the outcomes and goals they are trying to accomplish with AI, says Robert Bart, MD, chief medical information officer at UPMC.
“We are past the time that people are saying they want to use AI because it is AI,” Bart says. “You need to make sure as you are examining the use of AI in your health system that it drives the type of clinical outcomes you are trying to achieve. In addition, AI models should have the scale that your organization needs.”
HealthLeaders is conducting its AI in Clinical Care Mastermind program through December. The program brings together nearly a dozen healthcare executives to discuss their AI strategies and offerings. As part of the program, each of the panelists is talking with HealthLeaders about the use of AI in clinical care.
Bart says the process for identifying the desired goals and outcomes of an AI tool depends on the problem you are trying to solve.
“You want to identify your goals and desired outcomes early on when you are evaluating whether you want to adopt an AI tool, whether it impacts clinical care, the efficiency of care delivery, or other aspects that a health system has targeted,” he says.
“The thing about AI is that the algorithms are always learning,” Bart says. “You need to go back and re-examine the algorithms at intervals after you have started using them. You need to make sure that an algorithm provides the type of guidance or insights that were intended at the outset.”
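Bart does not describe UPMC's monitoring tooling, but the kind of periodic re-examination he describes can be sketched in a few lines: recompute a deployed model's discrimination on recent cases and flag it for governance review if performance has slipped. The sketch below is a rough illustration, not UPMC's process; the baseline value, threshold, and function names are assumptions.

```python
# Illustrative sketch only -- not UPMC's tooling. Assumes a deployed model's recent
# predictions and the observed outcomes are available as parallel lists.
from sklearn.metrics import roc_auc_score

BASELINE_AUROC = 0.80   # hypothetical performance measured at initial validation
ALERT_THRESHOLD = 0.05  # hypothetical tolerated drop before review is triggered

def review_model_performance(recent_scores, recent_outcomes):
    """Recompute discrimination on recent cases and flag drift for governance review."""
    current_auroc = roc_auc_score(recent_outcomes, recent_scores)
    drifted = (BASELINE_AUROC - current_auroc) > ALERT_THRESHOLD
    return current_auroc, drifted

# Example with made-up data: an outcome of 1 means the predicted event occurred.
auroc, needs_review = review_model_performance(
    recent_scores=[0.9, 0.2, 0.7, 0.4, 0.8, 0.1],
    recent_outcomes=[1, 0, 1, 0, 1, 0],
)
print(f"Recent AUROC: {auroc:.2f}, flag for re-examination: {needs_review}")
```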
UPMC has established an AI governance council to make sure that the health system is leveraging AI appropriately and safely, Bart says. The council has a dozen members, including clinicians, technologists, and diversity and equality staff.
“We have governance to make sure we are understanding the algorithms, understanding the use cases, and understanding the test data that the algorithms were generated on,” he says. “All of those things go into making sure we are leveraging AI in a safe manner to enhance care delivery.”
Diversity and equality are also key issues for AI governance.
“There are some concerns that AI algorithms can segment the patient population inappropriately,” Bart says. “We want to make sure that AI can be leveraged for all of the patients we serve at UPMC.”
Clinical care AI models at UPMC
UPMC was an early adopter of predictive algorithms, launching these AI tools about six years ago.
“We have been using AI for predictive algorithms related to length of stay and risk for readmission or hospitalization,” Bart says. “We have been trying to predict and understand those types of parameters for patients.”
The health system is using the technology to be more proactive in care delivery.
“Where we identify patients who are outliers such as high risk for hospitalization or readmission after hospitalization, we are trying to put in interventions to mitigate their risk of hospitalization or readmission,” Bart says.
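The article does not detail how UPMC operationalizes these predictions, but the general pattern Bart describes, scoring each patient and routing high-risk outliers to an intervention, can be sketched roughly as follows. The threshold, identifiers, and class names are illustrative assumptions, not details from the source.

```python
# Illustrative sketch only -- not UPMC's system. Assumes a predictive model has already
# produced a 30-day readmission risk score (0-1) for each recently discharged patient.
from dataclasses import dataclass

RISK_THRESHOLD = 0.30  # hypothetical cutoff for enrolling a patient in follow-up outreach

@dataclass
class DischargedPatient:
    patient_id: str
    readmission_risk: float  # model-estimated probability of readmission

def flag_for_intervention(patients):
    """Return the patients whose predicted risk is high enough to trigger outreach."""
    return [p for p in patients if p.readmission_risk >= RISK_THRESHOLD]

patients = [
    DischargedPatient("A-101", 0.12),
    DischargedPatient("A-102", 0.47),  # outlier -> schedule follow-up call, home visit, etc.
    DischargedPatient("A-103", 0.08),
]
for patient in flag_for_intervention(patients):
    print(f"Enroll {patient.patient_id} in post-discharge follow-up "
          f"(risk={patient.readmission_risk:.2f})")
```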
The health system is also using AI in pathology to identify disease states in tissue slides.
“The AI is prioritizing slides of tissue that might have areas of concern and need to be reviewed,” Bart says. “It allows the pathologist to be more efficient with their time.”
In a similar use case, UPMC is using AI in radiology.
“One of the AI tools we are using in the radiology space is related to stroke detection,” Bart says. “For strokes, seconds and minutes make a difference for the brain in the type of care you are delivering. So using AI in image analysis helps improve the speed with which our neurologists and neurosurgeons can make decisions for patients who have brain tissue at risk.”
As is the case with most health systems and hospitals that have adopted AI tools for clinical care, UPMC is using ambient listening AI technology to record conversations between clinicians and patients, then generate clinical documentation.
Compared to the other AI tools that the health system has adopted, the ambient listening AI technology has had the most profound impact on clinicians, according to Bart.
“We were one of the early adopters of ambient voice technology in documentation,” he says. “Our deployment is broad compared to what I have seen in other health systems. The feedback that we get from the clinicians who have adopted this tool is very good.”
For example, a family practitioner may see 20 patients over the course of a day in a clinic, then face “pajama time”: finishing the office day at 5 or 6 p.m., going home to see their family, then completing the documentation for the patients seen that day.
“One of the things we have seen is that pajama time has decreased significantly; and for some clinicians, pajama time has gone away completely,” Bart says. “For a clinician, it makes a huge difference if they can leave the clinic at 5 or 6 and have all of their documentation done as opposed to having to spend two hours in the evening catching up on documentation.”
Christopher Cheney is the CMO editor at HealthLeaders.