
Post-processing integration and semi-automated analysis of eye-tracking and motion-capture data obtained in immersive virtual reality environments to measure visuomotor integration
Author(s) -
Haylie L. Miller,
Ian Raphael Zurutuza,
Nicholas E. Fears,
Suleyman Olcay Polat,
Rodney D. Nielsen
Publication year - 2021
Publication title -
ACM Symposium on Eye Tracking Research and Applications
Language(s) - English
Resource type - Conference proceedings
DOI - 10.1145/3450341.3458881
Subject(s) - computer science , motion capture , computer vision , artificial intelligence , eye tracking , virtual reality , gaze , motion (physics) , task (project management) , pipeline (software) , match moving , adaptation (eye) , engineering , systems engineering , programming language , physics , optics
Mobile eye-tracking and motion-capture techniques yield rich, precisely quantifiable data that can inform our understanding of the relationship between visual and motor processes during task performance. However, these systems are rarely used in combination, in part because of the significant time and human resources required for post-processing and analysis. Recent advances in computer vision have opened the door for more efficient processing and analysis solutions. We developed a post-processing pipeline to integrate mobile eye-tracking and full-body motion-capture data. These systems were used simultaneously to measure visuomotor integration in an immersive virtual environment. Our approach enables calculation of a 3D gaze vector that can be mapped to the participant's body position and objects in the virtual environment using a uniform coordinate system. This approach is generalizable to other configurations, and enables more efficient analysis of eye, head, and body movements together during visuomotor tasks administered in controlled, repeatable environments.
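The core operation the abstract describes, expressing a 3D gaze vector in the same coordinate system as the body and the virtual environment, amounts to rotating the eye tracker's head-local gaze direction by the head orientation from motion capture, then casting a ray from the head position. The sketch below illustrates this idea under simplifying assumptions (ZYX Euler angles for head orientation, a single target point); the function names and conventions are illustrative, not taken from the authors' pipeline.

```python
import numpy as np

def euler_to_rotation(yaw, pitch, roll):
    """Build a rotation matrix from ZYX (yaw-pitch-roll) Euler angles.
    Assumed convention for illustration; real mocap systems vary."""
    cy, sy = np.cos(yaw), np.sin(yaw)
    cp, sp = np.cos(pitch), np.sin(pitch)
    cr, sr = np.cos(roll), np.sin(roll)
    Rz = np.array([[cy, -sy, 0], [sy, cy, 0], [0, 0, 1]])
    Ry = np.array([[cp, 0, sp], [0, 1, 0], [-sp, 0, cp]])
    Rx = np.array([[1, 0, 0], [0, cr, -sr], [0, sr, cr]])
    return Rz @ Ry @ Rx

def gaze_ray_world(head_pos, head_rot, gaze_dir_local):
    """Map a head-local gaze direction (from the eye tracker) into
    world coordinates using the head pose from motion capture.
    Returns the ray origin and a unit direction vector."""
    d = head_rot @ (gaze_dir_local / np.linalg.norm(gaze_dir_local))
    return head_pos, d

def point_to_ray_distance(point, origin, direction):
    """Shortest distance from a world-space point (e.g. a virtual
    object's position) to the gaze ray -- a simple proxy for whether
    the participant is looking at that object."""
    v = point - origin
    return np.linalg.norm(v - np.dot(v, direction) * direction)

# Example: head at the origin, level orientation, gaze straight ahead (+z).
R = euler_to_rotation(0.0, 0.0, 0.0)
origin, direction = gaze_ray_world(np.zeros(3), R, np.array([0.0, 0.0, 1.0]))
# A target 5 m ahead lies on the ray; one offset 1 m to the side is 1 m away.
on_axis = point_to_ray_distance(np.array([0.0, 0.0, 5.0]), origin, direction)
off_axis = point_to_ray_distance(np.array([1.0, 0.0, 5.0]), origin, direction)
```

Because both the mocap markers and the virtual objects live in the same world frame after this transform, per-frame gaze-to-object distances (or angular errors) can be computed directly, which is what makes the integrated analysis of eye, head, and body movements tractable.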