Open Access
Pain Level Estimation from Videos by Analyzing the Dynamics of Facial Landmarks with a Spatio-Temporal Graph Neural Network
Author(s) -
Fatemah Alhamdoosh,
Pietro Pala,
Stefano Berretti
Publication year - 2025
Publication title -
IEEE Transactions on Biometrics, Behavior, and Identity Science
Language(s) - English
Resource type - Magazines
eISSN - 2637-6407
DOI - 10.1109/tbiom.2025.3592836
Subject(s) - Bioengineering; Computing and Processing; Communication, Networking and Broadcast Technologies; Components, Circuits, Devices and Systems
Developing effective and accurate methods for the automatic estimation of pain level is vital, particularly for monitoring individuals who cannot communicate verbally, such as newborns and patients in intensive care units. This paper introduces a novel video-based approach for sequence-level pain estimation that addresses two primary challenges in the existing literature. First, we address privacy concerns in methods that rely on full facial images, which expose patient identities and thus limit their applicability in healthcare. Our approach uses facial landmarks, which offer insight into facial expressions while preserving privacy, since they do not suffice for personal identification. Second, pain is a dynamic state whose intensity varies over time. Our approach analyzes temporal features at short- and long-term levels, adapting to continuous frame sequences. In essence, we develop a regression model with two components: 1) a Short-term Dynamics Network, in which a Spatio-temporal Attention Graph Convolution Network (STAGCN) extracts short-term features from a spatio-temporal graph whose nodes represent the facial landmarks extracted from each frame, and 2) a Long-term Dynamics Network, in which a Gated Recurrent Unit (GRU) processes the sequence of short-term features to learn long-term patterns across the entire sequence. We validated our approach on the BioVid Heat Pain dataset (Parts A, B, and D) and MIntPain, assessing performance in multi-class and binary (pain vs. no pain) classification. Results demonstrate the approach's potential, even with partially occluded faces.
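
The record does not include code, but the two-component design described in the abstract (a spatio-temporal graph block over facial landmarks feeding a GRU that summarizes the sequence) can be illustrated with a minimal PyTorch sketch. Everything below is an assumption for illustration only: the class names, dimensions, pooling, the simplified per-node attention, and the fully connected adjacency are not the authors' implementation of STAGCN.

```python
import torch
import torch.nn as nn

class SpatioTemporalGraphBlock(nn.Module):
    """Hypothetical simplification of a spatio-temporal attention graph block:
    a spatial graph convolution over landmark nodes, a learnable per-node
    attention weighting, and a temporal convolution over frames."""
    def __init__(self, in_dim, out_dim, num_nodes, kernel_t=3):
        super().__init__()
        self.theta = nn.Linear(in_dim, out_dim)            # spatial graph-conv weights
        self.temporal = nn.Conv2d(out_dim, out_dim,        # temporal conv across frames
                                  kernel_size=(kernel_t, 1),
                                  padding=(kernel_t // 2, 0))
        self.attn = nn.Parameter(torch.ones(num_nodes))    # per-node attention logits
        self.act = nn.ReLU()

    def forward(self, x, adj):
        # x: (batch, frames, nodes, in_dim); adj: (nodes, nodes) normalized adjacency
        h = self.theta(torch.einsum("ij,btjf->btif", adj, x))      # spatial aggregation
        h = h * torch.softmax(self.attn, dim=0).view(1, 1, -1, 1)  # weight landmark nodes
        h = h.permute(0, 3, 1, 2)                                  # -> (B, C, T, N)
        h = self.act(self.temporal(h))
        return h.permute(0, 2, 3, 1)                               # -> (B, T, N, C)

class PainRegressor(nn.Module):
    """Short-term dynamics from the graph block, long-term dynamics from a GRU,
    then a regression head that scores the whole sequence."""
    def __init__(self, num_nodes=68, in_dim=2, hidden=64):
        super().__init__()
        self.short_term = SpatioTemporalGraphBlock(in_dim, hidden, num_nodes)
        self.long_term = nn.GRU(hidden, hidden, batch_first=True)
        self.head = nn.Linear(hidden, 1)

    def forward(self, landmarks, adj):
        # landmarks: (batch, frames, nodes, 2) -- x/y coordinates per facial landmark
        h = self.short_term(landmarks, adj)   # (B, T, N, hidden) short-term features
        h = h.mean(dim=2)                     # pool over landmark nodes -> (B, T, hidden)
        _, last = self.long_term(h)           # final GRU state summarizes the sequence
        return self.head(last.squeeze(0))     # sequence-level pain score

# Toy usage: a 32-frame clip of 68 landmarks with a uniform normalized adjacency.
model = PainRegressor()
adj = torch.full((68, 68), 1.0 / 68)
clip = torch.rand(4, 32, 68, 2)
print(model(clip, adj).shape)  # torch.Size([4, 1])
```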
