29‐3: An Easy‐to‐Implement and Low‐Cost VR Gaze‐Tracking System
Author(s) -
Sun Jiankang,
Zhang Hao,
Chen Lili,
Zhang Menglei,
Xue Yachong,
Li Xinkai
Publication year - 2021
Publication title -
SID Symposium Digest of Technical Papers
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.351
H-Index - 44
eISSN - 2168-0159
pISSN - 0097-966X
DOI - 10.1002/sdtp.14693
Subject(s) - computer vision , computer science , artificial intelligence , gaze , rendering (computer graphics) , pupil , eye tracking , ellipse , eye tracking on the iss , tracking system , software , computer graphics (images) , optics , mathematics , physics , geometry , filter (signal processing) , programming language
The main contribution of this paper is an easy‐to‐implement and low‐cost gaze‐tracking system for near‐eye displays, which can meet the needs of application scenarios such as near‐eye display interaction and foveated rendering. The hardware of the system is an infrared camera combined with an annular 850 nm infrared light source for image acquisition. The software comprises a sliding‐window pupil segmentation algorithm with an adaptive threshold, which achieves precise pupil segmentation; a pupil ellipse fitting method, which achieves precise pupil positioning; and a polynomial model that establishes the mapping between the pupil center and the fixation point. Considering the real application environment, an eye‐position compensation method based on the canthus point is adopted to achieve a more stable gaze calculation and a more complete VR gaze‐tracking system. In the experimental results, the average error of the system is 0.55° in the horizontal direction and 0.63° in the vertical direction, and the latency on an Intel i7‐6700HQ CPU is about 3.5 ms, which indicates that the system can perform gaze‐tracking calculations with the speed and precision required for VR.
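To illustrate the pipeline the abstract describes (pupil segmentation, pupil ellipse fitting, and a polynomial mapping from pupil center to fixation point), the sketch below shows one minimal way such a pipeline is commonly assembled. It is an assumption‐laden illustration, not the authors' implementation: Otsu thresholding stands in for the paper's sliding‐window adaptive threshold, the second‐order polynomial form is assumed, and the canthus‐based compensation step is omitted.

```python
# Hypothetical sketch of a pupil-to-gaze pipeline; not the paper's exact method.
import cv2
import numpy as np


def segment_pupil(ir_frame):
    """Segment the dark pupil in an IR eye image and return its ellipse center."""
    blur = cv2.GaussianBlur(ir_frame, (7, 7), 0)
    # Under 850 nm illumination the pupil is the darkest region; Otsu's
    # threshold is used here in place of the sliding-window adaptive threshold.
    _, mask = cv2.threshold(blur, 0, 255, cv2.THRESH_BINARY_INV + cv2.THRESH_OTSU)
    contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_NONE)
    if not contours:
        return None
    pupil = max(contours, key=cv2.contourArea)
    if len(pupil) < 5:  # cv2.fitEllipse needs at least 5 contour points
        return None
    (cx, cy), _axes, _angle = cv2.fitEllipse(pupil)
    return cx, cy


def fit_gaze_mapping(pupil_centers, screen_points):
    """Least-squares fit of an assumed 2nd-order polynomial mapping, one per axis."""
    px, py = np.asarray(pupil_centers, dtype=float).T
    A = np.column_stack([np.ones_like(px), px, py, px * py, px**2, py**2])
    targets = np.asarray(screen_points, dtype=float)
    coeff_x, *_ = np.linalg.lstsq(A, targets[:, 0], rcond=None)
    coeff_y, *_ = np.linalg.lstsq(A, targets[:, 1], rcond=None)
    return coeff_x, coeff_y


def predict_gaze(pupil_center, coeff_x, coeff_y):
    """Map a pupil center to a fixation point with the fitted coefficients."""
    px, py = pupil_center
    a = np.array([1.0, px, py, px * py, px**2, py**2])
    return float(a @ coeff_x), float(a @ coeff_y)
```

In such a setup the polynomial coefficients would be obtained from a short calibration in which the user fixates a grid of known screen points while pupil centers are recorded; the eye‐position compensation described in the paper would then correct for headset slippage before the mapping is applied.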