
I2DNet - Design and real-time evaluation of an appearance-based gaze estimation system
Author(s) -
L. R. D. Murthy,
Siddhi Brahmbhatt,
Somnath Arjun,
Pradipta Biswas
Publication year - 2021
Publication title -
Journal of Eye Movement Research
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.25
H-Index - 20
ISSN - 1995-8692
DOI - 10.16910/jemr.14.4.2
Subject(s) - gaze , computer science , artificial intelligence , eye tracking , computer vision , active appearance model , machine learning , pattern recognition (psychology) , image (mathematics)
The gaze estimation problem can be addressed using either model-based or appearance-based approaches. Model-based approaches rely on features extracted from eye images to fit a 3D eyeball model and obtain a gaze point estimate, while appearance-based methods attempt to map captured eye images directly to the gaze point without any handcrafted features. Recently, the availability of large datasets and novel deep learning techniques has enabled appearance-based methods to achieve higher accuracy than model-based approaches. However, many appearance-based gaze estimation systems perform well in within-dataset validation but fail to provide the same degree of accuracy in cross-dataset evaluation. Hence, it is still unclear how well current state-of-the-art approaches perform in real time in an interactive setting on unseen users. This paper proposes I2DNet, a novel architecture aimed at improving subject-independent gaze estimation accuracy, which achieved state-of-the-art mean angular errors of 4.3 and 8.4 degrees on the MPIIGaze and RT-Gene datasets, respectively. We evaluated the proposed system as a real-time gaze-controlled interface on a 9-block pointing and selection task and compared it with WebGazer.js and OpenFace 2.0. In a user study with 16 participants, our proposed system reduced selection time and the number of missed selections by a statistically significant margin compared to the other two systems.