Open Access
A Multimodal Deep Learning Architecture for Estimating Quality of Life for Advanced Cancer Patients Based on Wearable Devices and Patient-Reported Outcome Measures
Author(s) -
Muhammad Salman Haleem,
Vasilis Aidonis,
Eleni I. Georga,
Maria Krini,
Maria Matsangidou,
Angelos P. Kassianos,
Constantinos S. Pattichis,
Miguel Rujas,
Laura Lopez-Perez,
Giuseppe Fico,
Leandro Pecchia,
Dimitrios I. Fotiadis,
Gatekeeper Consortium
Publication year - 2025
Publication title -
IEEE Journal of Biomedical and Health Informatics
Language(s) - English
Resource type - Magazines
SCImago Journal Rank - 1.293
H-Index - 125
eISSN - 2168-2208
pISSN - 2168-2194
DOI - 10.1109/jbhi.2025.3597054
Subject(s) - bioengineering , communication, networking and broadcast technologies , components, circuits, devices and systems , computing and processing , signal processing and analysis
Monitoring of advanced cancer patients' health, treatment, and supportive care is essential for improving cancer survival outcomes. Traditionally, oncology has relied on clinical metrics such as survival rates, time to disease progression, and clinician-assessed toxicities. In recent years, patient-reported outcome measures (PROMs) have provided a complementary perspective, offering insights into patients' health-related quality of life (HRQoL). However, collecting PROMs consistently requires frequent clinical assessments, creating substantial logistical challenges. Wearable devices combined with artificial intelligence (AI) offer an innovative solution for continuous, real-time HRQoL monitoring. While deep learning models effectively capture temporal patterns in physiological data, most existing approaches are unimodal, limiting their ability to address patient heterogeneity and complexity. This study introduces a multimodal deep learning approach to estimate HRQoL in advanced cancer patients. Physiological data, such as heart rate and sleep quality, collected via wearable devices are analyzed using a hybrid model combining convolutional neural networks (CNNs) and bidirectional long short-term memory (BiLSTM) networks with an attention mechanism: the CNNs detect localized patterns, the BiLSTM captures temporal dynamics, and the attention mechanism highlights the most informative features. PROMs, including the Hospital Anxiety and Depression Scale (HADS) and the Integrated Palliative Care Outcome Scale (IPOS), are processed through a parallel neural network before being integrated into the physiological data pipeline. The proposed model was validated on data from 204 patients collected over 42 days, achieving a mean absolute percentage error (MAPE) of 0.24 in HRQoL prediction. These results demonstrate the potential of combining wearable data and PROMs to improve advanced cancer care.
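The abstract describes the architecture only at a high level. The PyTorch sketch below is an illustrative reconstruction of such a CNN-BiLSTM-attention model with a parallel PROM branch, not the authors' implementation; the layer sizes, the four-channel wearable input, the two-score PROM vector, and all names (HRQoLNet, hidden, etc.) are assumptions made for the example.

    import torch
    import torch.nn as nn

    class HRQoLNet(nn.Module):
        """Hypothetical CNN + BiLSTM + attention model fused with a PROM branch."""
        def __init__(self, n_signals=4, n_proms=2, hidden=64):
            super().__init__()
            # CNN detects localized patterns in the wearable time series
            self.cnn = nn.Sequential(
                nn.Conv1d(n_signals, 32, kernel_size=5, padding=2),
                nn.ReLU(),
                nn.Conv1d(32, 64, kernel_size=5, padding=2),
                nn.ReLU(),
            )
            # BiLSTM captures temporal dynamics across days
            self.bilstm = nn.LSTM(64, hidden, batch_first=True, bidirectional=True)
            # Additive attention scores each time step
            self.attn = nn.Linear(2 * hidden, 1)
            # Parallel branch for PROM scores (e.g., HADS and IPOS totals)
            self.prom_mlp = nn.Sequential(nn.Linear(n_proms, 32), nn.ReLU())
            # Fusion head regresses the HRQoL score
            self.head = nn.Linear(2 * hidden + 32, 1)

        def forward(self, physio, proms):
            # physio: (batch, time, n_signals); proms: (batch, n_proms)
            x = self.cnn(physio.transpose(1, 2)).transpose(1, 2)  # (batch, time, 64)
            h, _ = self.bilstm(x)                                 # (batch, time, 2*hidden)
            w = torch.softmax(self.attn(h), dim=1)                # per-step attention weights
            context = (w * h).sum(dim=1)                          # weighted temporal summary
            p = self.prom_mlp(proms)
            return self.head(torch.cat([context, p], dim=1)).squeeze(-1)

    # Example: 42 days of 4 daily wearable features plus 2 PROM scores per patient
    model = HRQoLNet()
    physio = torch.randn(8, 42, 4)
    proms = torch.randn(8, 2)
    print(model(physio, proms).shape)  # torch.Size([8])

Under these assumptions, the attention weights give a per-day importance profile over the 42-day window, which is one plausible way a model of this kind can indicate which periods drive the predicted HRQoL score.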
