Conflicting roles for humans in learning health systems and AI‐enabled healthcare
Author(s) - Kasperbauer T. J.
Publication year - 2021
Publication title - Journal of Evaluation in Clinical Practice
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.737
H-Index - 73
eISSN - 1365-2753
pISSN - 1356-1294
DOI - 10.1111/jep.13510
Subject(s) - health care , healthcare system , knowledge management , psychology , artificial intelligence , computer science , medicine , political science , law
The goals of learning health systems (LHS) and of AI in medicine overlap in many respects. Both require significant improvements in data sharing and IT infrastructure, aim to provide more personalized care for patients, and strive to break down traditional barriers between research and care. However, the defining features of LHS and AI diverge when it comes to the people involved in medicine, both patients and providers. LHS aim to enhance physician‐patient relationships, while developments in AI emphasize a physician‐less experience. LHS also encourage better coordination of specialists across the health system, whereas AI aims to replace many specialists with technology and algorithms. This paper argues that these points of conflict may require a reconsideration of the role of humans in medical decision making. Although it is currently unclear to what extent machines will replace humans in healthcare, the parallel development of LHS and AI raises important questions about the exact role for humans within AI‐enabled healthcare.