Open Access
Utilization of Smartphone Depth Mapping Cameras for App-Based Grading of Facial Movement Disorders: Development and Feasibility Study
Author(s) - Johannes Taeger, S Bischoff, Rudolf Hagen, Kristen Rak
Publication year - 2021
Publication title - JMIR mHealth and uHealth
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.356
H-Index - 50
ISSN - 2291-5222
DOI - 10.2196/19346
Subject(s) - computer science, computer vision, smartphone app, mobile device, mhealth, artificial intelligence, grading (engineering), physical medicine and rehabilitation, human-computer interaction, psychology, medicine, engineering, psychological interventions, world wide web, civil engineering, psychiatry
Background - Various clinician-graded and software-based scoring systems are available for the classification of facial paresis. They serve the scientific and clinical assessment of the spontaneous course of the disease and the monitoring of therapeutic interventions. Nevertheless, none has achieved universal acceptance in everyday clinical practice. Hence, a quick and precise tool for assessing the functional status of the facial nerve would be desirable. In this context, the possibilities that the TrueDepth camera of recent iPhone models offers have sparked our interest.

Objective - This paper describes the use of the iPhone's TrueDepth camera, accessed via a specially developed app prototype, for quick, objective, and reproducible quantification of facial asymmetries.

Methods - After conceptual and user interface design, a native iOS app prototype was programmed that accesses and processes the data of the TrueDepth camera. Building on a dedicated algorithm, a new index for the grading of unilateral facial paresis, ranging from 0% to 100%, was developed. The algorithm was adapted to the well-established Stennert index by weighting the individual facial regions according to functional and cosmetic aspects (an illustrative sketch of such a weighted computation follows the abstract). Test measurements with healthy subjects were performed with the app to demonstrate the reliability of the system.

Results - After the development process, the app prototype ran without build-time or runtime errors and also worked under suboptimal conditions such as varying measurement angles, so it met our criteria for a safe and reliable app. The newly defined index expresses the measurement result as a readily understandable percentage value for each half of the face. The measurements were reproducible, correctly rated the facial expressions of healthy individuals as symmetrical in all cases, and showed no statistically significant intertest variability.

Conclusions - Based on the experience with the app prototype in healthy subjects, the use of the TrueDepth camera should have considerable potential for app-based grading of facial movement disorders. The app and its algorithm, which is based on theoretical considerations, should be evaluated in a prospective clinical study and correlated with common facial grading scores.
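The paper does not publish the implementation details of its weighting algorithm, so any reconstruction is speculative. A minimal Swift sketch, assuming the app reads ARKit's per-frame blend shape coefficients from the TrueDepth camera (ARFaceAnchor.blendShapes) and assuming hypothetical region weights loosely modeled on the Stennert index's emphasis on eye closure and mouth movement, might look like this:

```swift
import ARKit

// Hypothetical region weights, loosely inspired by the Stennert index's
// emphasis on eye closure and mouth movement. Neither the regions nor
// the weights are taken from the paper; they are illustrative assumptions.
struct RegionWeight {
    let left: ARFaceAnchor.BlendShapeLocation
    let right: ARFaceAnchor.BlendShapeLocation
    let weight: Float
}

let regionWeights: [RegionWeight] = [
    RegionWeight(left: .browDownLeft,    right: .browDownRight,    weight: 0.15),
    RegionWeight(left: .eyeBlinkLeft,    right: .eyeBlinkRight,    weight: 0.35),
    RegionWeight(left: .mouthSmileLeft,  right: .mouthSmileRight,  weight: 0.35),
    RegionWeight(left: .cheekSquintLeft, right: .cheekSquintRight, weight: 0.15),
]

/// Computes a symmetry index for one captured frame: 100% means the
/// left and right activations match in every weighted region, 0% means
/// one side shows no movement wherever the other side does.
func symmetryIndex(for anchor: ARFaceAnchor) -> Float {
    var index: Float = 0
    for region in regionWeights {
        let l = anchor.blendShapes[region.left]?.floatValue ?? 0
        let r = anchor.blendShapes[region.right]?.floatValue ?? 0
        let stronger = max(l, r)
        // A region that is inactive on both sides counts as symmetrical.
        let ratio = stronger > 0 ? min(l, r) / stronger : 1
        index += region.weight * ratio
    }
    return index * 100
}
```

In such a scheme, each region contributes the ratio of its weaker side's activation to its stronger side's, so a fully symmetrical expression scores 100% and a completely immobile half-face scores 0%, mirroring the index range described in the Methods. In a real app, symmetryIndex would be evaluated as face anchors update in an ARSessionDelegate callback; the actual regions, weights, and aggregation in the published prototype may well differ.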