
Choosing the right time granularity for analysis of digital biomarker trajectories
Author(s) - Wakim, Nicole I.; Braun, Thomas M.; Kaye, Jeffrey A.; Dodge, Hiroko H.
Publication year - 2020
Publication title - Alzheimer's & Dementia: Translational Research & Clinical Interventions
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 2.49
H-Index - 30
ISSN - 2352-8737
DOI - 10.1002/trc2.12094
Subject(s) - granularity , computer science , data mining , missing data , digital data , data science , machine learning
Introduction - The use of digital biomarker data in dementia research provides the opportunity for frequent cognitive and functional assessments that were not previously available using conventional approaches. Assessing high-frequency digital biomarker data can potentially increase the opportunities for early detection of cognitive and functional decline because of the improved precision of person-specific trajectories. However, we often face a decision to condense time-stamped data into a coarser time granularity, defined as the frequency at which measurements are observed or summarized, for statistical analyses. It is important to find a balance between the ease of analysis gained by condensing the data and the integrity of the data, which is reflected in the chosen time granularity.

Methods - In this paper, we discuss factors that need to be considered when faced with a time granularity decision. These factors include follow-up time, variables of interest, pattern detection, and signal-to-noise ratio.

Results - We applied our procedure to real-world data comprising longitudinal in-home monitored walking speed. The example sheds light on typical problems that such data present and on how the above factors can be used in exploratory analysis to choose an appropriate time granularity.

Discussion - Further work is required to explore issues with missing data and computational efficiency.
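The core operation the abstract describes, condensing time-stamped measurements into a coarser time granularity, can be sketched with pandas resampling. The data below are simulated for illustration only (hourly walking-speed readings with Gaussian noise); the column name, sampling frequency, and noise level are assumptions, not details from the paper.

```python
import numpy as np
import pandas as pd

rng = np.random.default_rng(0)

# Simulated in-home walking-speed readings (cm/s): one reading per hour
# over 90 days. Real sensor data would be irregular and have missingness.
idx = pd.date_range("2020-01-01", periods=90 * 24, freq="h")
speed = pd.Series(80 + rng.normal(0, 5, len(idx)), index=idx,
                  name="walking_speed")

# Condense to two candidate granularities by averaging within each window.
daily = speed.resample("D").mean()    # one summary value per day
weekly = speed.resample("W").mean()   # one summary value per week

# Coarser granularity trades temporal detail (fewer points to detect
# patterns) for reduced noise: averaging more readings per window shrinks
# the variability of the summarized trajectory.
print(len(daily), len(weekly))
print(daily.std(), weekly.std())
```

Comparing the summary series' lengths and standard deviations across candidate granularities is one simple way to explore the signal-to-noise trade-off the authors describe before committing to a granularity for modeling.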