Accurate Real‐time 3D Gaze Tracking Using a Lightweight Eyeball Calibration
Author(s) - Wen Q., Bradley D., Beeler T., Park S., Hilliges O., Yong J., Xu F.
Publication year - 2020
Publication title - Computer Graphics Forum
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.578
H-Index - 120
eISSN - 1467-8659
pISSN - 0167-7055
DOI - 10.1111/cgf.13945
Subject(s) - gaze, computer vision, computer science, artificial intelligence, retargeting, monocular, eye tracking, calibration, RGB color model, ground truth, mathematics, statistics
Abstract - 3D gaze tracking from a single RGB camera is very challenging due to the lack of information for determining the accurate gaze target from a monocular RGB sequence. The eyes tend to occupy only a small portion of the video, and even small errors in estimated eye orientations can lead to very large errors in the triangulated gaze target. We overcome these difficulties with a novel lightweight eyeball calibration scheme that determines the user-specific visual axis, eyeball size and position in the head. Unlike previous calibration techniques, we do not need the ground-truth positions of the gaze points. In the online stage, gaze is tracked by a new gaze fitting algorithm and refined by a 3D gaze regression method to correct for bias errors. Our regression is pre-trained on several individuals and works well for novel users. After the lightweight one-time user calibration, our method operates in real time. Experiments show that our technique achieves state-of-the-art accuracy in gaze angle estimation, and we demonstrate applications of 3D gaze target tracking and gaze retargeting to an animated 3D character.
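
The abstract's point about error amplification can be made concrete with a small, generic two-ray triangulation exercise. The sketch below (Python with NumPy) is not the paper's gaze fitting or regression method; the eye positions, target depth, and 1-degree perturbation are assumed purely to illustrate how a small angular error in one eye's estimated gaze direction displaces the triangulated 3D gaze target.

import numpy as np

def closest_point_between_rays(o1, d1, o2, d2):
    # Midpoint of the shortest segment between two rays with origins o1, o2
    # and unit directions d1, d2 (standard least-squares two-line triangulation).
    b = np.dot(d1, d2)
    w = o1 - o2
    denom = 1.0 - b * b
    if abs(denom) < 1e-9:  # near-parallel rays: no stable intersection
        return None
    t1 = (b * np.dot(d2, w) - np.dot(d1, w)) / denom
    t2 = (np.dot(d2, w) - b * np.dot(d1, w)) / denom
    return 0.5 * ((o1 + t1 * d1) + (o2 + t2 * d2))

def unit(v):
    return v / np.linalg.norm(v)

# Assumed geometry: eyeball centres about 6 cm apart, gaze target 2 m ahead.
left_eye  = np.array([-0.03, 0.0, 0.0])
right_eye = np.array([ 0.03, 0.0, 0.0])
target    = np.array([ 0.00, 0.0, 2.0])
d_left, d_right = unit(target - left_eye), unit(target - right_eye)

# Perturb the left eye's gaze direction by 1 degree of yaw.
a = np.deg2rad(1.0)
R = np.array([[ np.cos(a), 0.0, np.sin(a)],
              [ 0.0,       1.0, 0.0      ],
              [-np.sin(a), 0.0, np.cos(a)]])
p = closest_point_between_rays(left_eye, R @ d_left, right_eye, d_right)
print(np.linalg.norm(p - target))  # roughly 0.7 m of target error from a 1-degree mistake

With the unperturbed directions, the same routine recovers the target almost exactly, which is why accurate per-user estimates of the visual axis and eyeball geometry matter so much for gaze-target estimation from a single camera.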
