Open Access
Overcoming Data Scarcity in Human Activity Recognition
Author(s) -
Orhan Konak,
Lucas Liebe,
Kirill Postnov,
Franz Sauerwald,
Hristijan Gjoreski,
Mitja Lustrek,
Bert Arnrich
Publication year - 2023
Publication title -
2023 45th Annual International Conference of the IEEE Engineering in Medicine and Biology Society (EMBC)
Language(s) - English
Resource type - Conference proceedings
eISSN - 2694-0604
ISBN - 979-8-3503-2447-1
DOI - 10.1109/embc40787.2023.10340387
Subject(s) - bioengineering , engineering profession , general topics for engineers
Wearable sensors have become increasingly popular in recent years, with technological advances leading to cheaper, more widely available, and smaller devices. As a result, there has been growing interest in applying machine learning techniques for Human Activity Recognition (HAR) in healthcare. These techniques can improve patient care and treatment by accurately detecting and analyzing various activities and behaviors. However, current approaches often require large amounts of labeled data, which can be difficult and time-consuming to obtain. In this study, we propose a new approach that uses synthetic sensor data generated by 3D engines and Generative Adversarial Networks to overcome this obstacle. We evaluate the synthetic data using several methods and compare it to real-world data, including classification results with baseline models. Our results show that synthetic data can improve the performance of deep neural networks: on a known dataset, the F1-score for less complex activities exceeds state-of-the-art results by 8.4% to 73%. However, as we show on a self-recorded nursing-activity dataset with longer recordings, this effect diminishes for more complex activities. This research highlights the potential of synthetic sensor data generated from multiple sources to overcome data scarcity in HAR.
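The evaluation strategy described above, training a baseline classifier on scarce real data, then again on real data augmented with synthetic samples, and comparing F1-scores, can be sketched as follows. This is a minimal illustration, not the paper's pipeline: the "synthetic" samples here are drawn from a toy Gaussian distribution as a stand-in for GAN- or 3D-engine-generated sensor windows, and the random forest is a generic baseline, not one of the authors' models.

```python
import numpy as np
from sklearn.ensemble import RandomForestClassifier
from sklearn.metrics import f1_score

rng = np.random.default_rng(0)

def make_data(n_per_class, shift, rng):
    """Toy stand-in for labeled sensor windows: two activity classes
    with different accelerometer-like feature means (6 features)."""
    X0 = rng.normal(0.0, 1.0, size=(n_per_class, 6))
    X1 = rng.normal(shift, 1.0, size=(n_per_class, 6))
    X = np.vstack([X0, X1])
    y = np.array([0] * n_per_class + [1] * n_per_class)
    return X, y

# Scarce real training set; larger held-out test set of real data.
X_train, y_train = make_data(10, 1.5, rng)
X_test, y_test = make_data(500, 1.5, rng)

# Baseline: train only on the scarce real data.
clf = RandomForestClassifier(random_state=0).fit(X_train, y_train)
f1_real = f1_score(y_test, clf.predict(X_test), average="macro")

# Augmented: add "synthetic" samples (placeholder for generated data).
X_syn, y_syn = make_data(200, 1.5, rng)
X_aug = np.vstack([X_train, X_syn])
y_aug = np.concatenate([y_train, y_syn])

clf_aug = RandomForestClassifier(random_state=0).fit(X_aug, y_aug)
f1_aug = f1_score(y_test, clf_aug.predict(X_test), average="macro")

print(f"F1 (real only): {f1_real:.3f}, F1 (real + synthetic): {f1_aug:.3f}")
```

The key design point mirrored here is that the test set contains only real data, so any F1 gain from augmentation reflects genuine transfer from synthetic to real distributions rather than the model memorizing generated samples.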
