Open Access
Clinically relevant pretraining is all you need
Author(s) -
Oliver J Bear Don't Walk,
Tony Sun,
Adler J. Perotte,
Noémie Elhadad
Publication year - 2021
Publication title -
Journal of the American Medical Informatics Association
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.614
H-Index - 150
eISSN - 1527-974X
pISSN - 1067-5027
DOI - 10.1093/jamia/ocab086
Subject(s) - transfer learning , computer science , artificial intelligence , natural language processing , machine learning
Clinical notes present a wealth of information for applications in the clinical domain, but heterogeneity across clinical institutions and settings presents challenges for their processing. The clinical natural language processing field has made strides in overcoming domain heterogeneity, while pretrained deep learning models present opportunities to transfer knowledge from one task to another. Pretrained models have performed well when transferred to new tasks; however, it is not well understood whether these models generalize across differences in institutions and settings within the clinical domain. We explore whether institution- or setting-specific pretraining is necessary for pretrained models to perform well when transferred to new tasks. We find no significant performance difference between models pretrained across institutions and settings, indicating that clinically pretrained models transfer well across such boundaries. Given a clinically pretrained model, clinical natural language processing researchers may forgo the time-consuming pretraining step without a significant performance drop.
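The transfer setting the abstract describes can be caricatured in a few lines of numpy: a model "pretrained" on plentiful data from one institution is reused at another institution that has little data of its own. This is a toy linear stand-in, not the paper's actual method (the study works with pretrained clinical language models); the data generator, dimensions, and the averaging "fine-tune" step are all illustrative assumptions.

```python
import numpy as np

rng = np.random.default_rng(0)

def make_data(n, w, noise, rng):
    """Synthetic regression data; stands in for one institution's notes."""
    X = rng.normal(size=(n, 5))
    y = X @ w + noise * rng.normal(size=n)
    return X, y

# Assumption: the institutions share underlying structure (the weight
# vector), mirroring the finding that clinical pretraining transfers.
w_shared = rng.normal(size=5)
X_src, y_src = make_data(500, w_shared, 0.1, rng)  # "pretraining" institution
X_tgt, y_tgt = make_data(20, w_shared, 0.1, rng)   # small target institution

# "Pretrain" by least squares on the source institution's data...
w_pre = np.linalg.lstsq(X_src, y_src, rcond=None)[0]
# ...then "fine-tune" on the target: here crudely approximated by
# averaging with the target's own least-squares fit.
w_tgt_only = np.linalg.lstsq(X_tgt, y_tgt, rcond=None)[0]
w_transfer = 0.5 * (w_pre + w_tgt_only)

# Evaluate both on held-out target-distribution data.
X_test, y_test = make_data(200, w_shared, 0.1, rng)
mse = lambda w: float(np.mean((X_test @ w - y_test) ** 2))
print(f"target-only MSE: {mse(w_tgt_only):.4f}")
print(f"transfer MSE:    {mse(w_transfer):.4f}")
```

The sketch only illustrates the experimental question, not its answer: the paper's comparison is between language models pretrained at different institutions and settings, and its conclusion is that the choice of pretraining institution makes no significant difference downstream.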
