Open Access
Attitudes of Patients and Health Professionals Regarding Screening Algorithms: Qualitative Study
Author(s) -
Christina Oxholm,
AnneMarie Søndergaard Christensen,
Regina Christiansen,
Uffe Kock Wiil,
Anette Søgaard Nielsen
Publication year - 2021
Publication title -
jmir formative research
Language(s) - English
Resource type - Journals
ISSN - 2561-326X
DOI - 10.2196/17971
Subject(s) - health professionals, meaning (existential), qualitative research, medicine, psychology, nursing, medical education, health care, social science, sociology, economics, psychotherapist, economic growth
Background: As a preamble to developing a tool that can help health professionals at hospitals identify whether a patient may have an alcohol abuse problem, this study investigates the opinions and attitudes of both health professionals and patients about using patient data from electronic health records (EHRs) in an algorithm that screens for alcohol problems.

Objective: The aim of this study was to investigate the attitudes and opinions of patients and health professionals at hospitals regarding the use of previously collected data to develop and implement an algorithmic helping tool in the EHR for screening for unhealthy alcohol habits. In addition, the study aimed to analyze how patients would feel about being asked, and staff about asking, about alcohol on the basis of a notification from such a tool in the EHR.

Methods: Using semistructured interviews, we interviewed 9 health professionals and 5 patients to explore their opinions and attitudes about an algorithm-based helping tool and about asking and being asked about alcohol use when prompted by a reminder from this type of tool. The data were analyzed using an ad hoc method combining close reading and meaning condensation.

Results: The health professionals were both positive and negative about a helping tool grounded in algorithms. On the positive side, they noted that such a tool could save time by providing a quick overview, provided it was easy to use; on the negative side, they worried that it might displace the professionals' own clinical instinct. The patients were overall positive about the helping tool, stating that they would find it beneficial for preventive care, although some expressed concern that the information provided by the tool could be misused.

Conclusions: When developing and implementing an algorithmic helping tool, the following aspects should be considered: (1) making the tool's recommendations as transparent as possible, avoiding black-boxing, and ensuring room for professional discretion in clinical decision making; and (2) including the attitudes and opinions of patients and health professionals in the design and development process of such a tool.
