
If not for an anthropologist and a sociologist, the leaders of a prominent health innovation hub at Duke University would never have known that the clinical AI tool they had been using on hospital patients for two years was making life far more difficult for the hospital's nurses.
The tool, which uses deep learning to estimate the chances that a hospital patient will develop sepsis, has had an overwhelmingly positive impact on patients. But it required nurses to present its results — in the form of a color-coded risk scorecard — to clinicians, including physicians they had never worked with before. That disrupted the hospital's traditional power hierarchy and workflow, leaving nurses uncomfortable and doctors defensive.
As a growing number of leading health systems rush to deploy AI-powered tools to help predict patient outcomes — often on the premise that they will boost clinicians' efficiency, cut hospital costs, and improve patient care — far less attention has been paid to how those tools affect the people charged with using them: frontline health care workers.