Gaze‐driven Object Tracking for Real Time Rendering
Author(s) - Mantiuk R., Bazyluk B., Mantiuk R. K.
Publication year - 2013
Publication title - Computer Graphics Forum
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.578
H-Index - 120
eISSN - 1467-8659
pISSN - 0167-7055
DOI - 10.1111/cgf.12036
Subject(s) - computer vision , computer science , artificial intelligence , eye tracking , rendering (computer graphics) , gaze , robustness , fixation (eye movement)
Abstract - To efficiently deploy eye-tracking within 3D graphics applications, we present a new probabilistic method that predicts the patterns of a user's eye fixations in animated 3D scenes from noisy eye-tracker data. The proposed method utilises both the eye-tracker data and the known information about the 3D scene to improve the accuracy, robustness and stability of the predicted fixations. Eye-tracking can thus be used, for example, to induce focal cues via gaze-contingent depth-of-field rendering, add intuitive controls to a video game, and create a highly reliable scene-aware saliency model. The computed probabilities rely on the consistency of the gaze scan-paths with the position and velocity of a moving or stationary target. The temporal characteristic of eye fixations is imposed by a Hidden Markov model, which steers the solution towards the most probable fixation patterns. The derivation of the algorithm is driven by the data from two eye-tracking experiments: the first provides actual eye-tracker readings together with the position of the target to be tracked; the second is used to derive a JND-scaled (Just Noticeable Difference) quality metric that quantifies the perceived loss of quality due to the errors of the tracking algorithm. Data from both experiments are used to justify design choices, and to calibrate and validate the tracking algorithms. This novel method outperforms commonly used fixation algorithms and is able to track objects smaller than the nominal error of an eye-tracker.
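The sketch below illustrates the general idea described in the abstract, not the authors' actual implementation: each candidate scene object is treated as a hidden state of an HMM, the emission likelihood scores how consistent a noisy gaze sample is with the object's screen-space position and velocity, and Viterbi decoding recovers the most probable sequence of fixated objects. The Gaussian emission model, the constants, and all names are assumptions made for illustration.

```python
# Illustrative sketch of HMM-based fixation-target tracking (assumed model,
# not the paper's implementation). Hidden states = candidate scene objects;
# observations = noisy gaze samples.
import numpy as np


def track_fixated_object(gaze, objects, sigma_pos=30.0, sigma_vel=60.0,
                         p_stay=0.95):
    """Return the most probable fixated-object index for each frame.

    gaze      : (T, 2) array of noisy gaze positions in pixels.
    objects   : (T, N, 2) array of N candidate objects' screen positions.
    sigma_pos : assumed std. dev. (px) of the eye-tracker position error.
    sigma_vel : assumed std. dev. (px/frame) of the velocity mismatch.
    p_stay    : prior probability of keeping the same target; this diagonal-
                dominant transition prior imposes the temporal persistence
                of fixations mentioned in the abstract.
    """
    T, N = objects.shape[0], objects.shape[1]

    # Emission log-likelihoods: gaze should match the object's position, and
    # gaze velocity should match the object's velocity (smooth pursuit of a
    # moving target, near-zero velocity for a stationary one).
    gaze_vel = np.diff(gaze, axis=0, prepend=gaze[:1])
    obj_vel = np.diff(objects, axis=0, prepend=objects[:1])
    d_pos = np.linalg.norm(gaze[:, None, :] - objects, axis=2)
    d_vel = np.linalg.norm(gaze_vel[:, None, :] - obj_vel, axis=2)
    log_emit = -0.5 * (d_pos / sigma_pos) ** 2 - 0.5 * (d_vel / sigma_vel) ** 2

    # Transition log-probabilities: strongly favour staying on the current
    # target, with a small uniform chance of switching to any other.
    log_trans = np.full((N, N), np.log((1.0 - p_stay) / max(N - 1, 1)))
    np.fill_diagonal(log_trans, np.log(p_stay))

    # Viterbi decoding of the most probable target sequence.
    score = log_emit[0].copy()
    back = np.zeros((T, N), dtype=int)
    for t in range(1, T):
        cand = score[:, None] + log_trans       # cand[i, j]: from i to j
        back[t] = np.argmax(cand, axis=0)
        score = cand[back[t], np.arange(N)] + log_emit[t]

    path = np.zeros(T, dtype=int)
    path[-1] = int(np.argmax(score))
    for t in range(T - 1, 0, -1):
        path[t - 1] = back[t, path[t]]
    return path
```

In this reading, the diagonal-dominant transition matrix is what steers the solution towards temporally stable fixation patterns, letting the decoder lock onto a target even when individual gaze samples wander by more than the tracker's nominal error. For online use, the offline Viterbi pass could be replaced with a forward (filtering) recursion over the same emission and transition terms.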
