A Head Pose-free Approach for Appearance-based Gaze Estimation
Author(s) -
Feng Lu,
Takahiro Okabe,
Yusuke Sugano,
Yoichi Sato
Publication year - 2011
Language(s) - English
Resource type - Conference proceedings
DOI - 10.5244/c.25.126
Subject(s) - gaze estimation , appearance-based methods , computer vision , artificial intelligence , computer science , head pose , appearance distortion , rotation (mathematics) , degrees of freedom , eye tracking
To infer human gaze from eye appearance, various methods have been proposed. However, most of them assume a fixed head pose, because allowing free head motion adds 6 degrees of freedom to the problem and requires a prohibitively large number of training samples. In this paper, we aim at solving the appearance-based gaze estimation problem under free head motion without significantly increasing the cost of training. The idea is to decompose the problem into subproblems, including initial estimation under a fixed head pose and subsequent compensation for estimation biases caused by head rotation and eye appearance distortion. Each subproblem is then solved by either a learning-based method or a geometric calculation. Specifically, the gaze estimation bias caused by eye appearance distortion is learnt effectively from a 5-second video clip. Extensive experiments were conducted to verify the effectiveness of the proposed approach.
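The decomposition described in the abstract can be sketched in a few lines. The sketch below is purely illustrative and is not the paper's actual formulation: it assumes gaze is a 3D unit vector, reduces head pose to a single yaw angle, and models the appearance-distortion compensation as a learned additive bias; the function names (`rotate_y`, `estimate_gaze`) and these simplifications are the author of this summary's own.

```python
import math

def rotate_y(v, theta):
    """Rotate a 3D direction about the vertical (y) axis by theta radians."""
    c, s = math.cos(theta), math.sin(theta)
    x, y, z = v
    return (c * x + s * z, y, -s * x + c * z)

def estimate_gaze(fixed_pose_gaze, head_yaw, distortion_bias):
    """Illustrative decomposition: take the gaze estimated as if the head were
    fixed, compensate geometrically for head rotation, then subtract a bias
    learned for eye appearance distortion, and renormalize."""
    gx, gy, gz = rotate_y(fixed_pose_gaze, head_yaw)
    bx, by, bz = distortion_bias
    v = (gx - bx, gy - by, gz - bz)
    n = math.sqrt(sum(c * c for c in v))
    return tuple(c / n for c in v)

# With no head rotation and no learned bias, the fixed-pose estimate passes
# through unchanged.
gaze = estimate_gaze((0.0, 0.0, 1.0), 0.0, (0.0, 0.0, 0.0))
```

In the paper itself, the rotation compensation is geometric while the distortion bias is learned from a short (5-second) video clip; here both are stand-in placeholders.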
