See What I Mean? Mobile Eye-Perspective Rendering for Optical See-Through Head-Mounted Displays
Author(s) -
Gerlinde Emsenhuber,
Tobias Langlotz,
Denis Kalkofen,
Markus Tatzgern
Publication year - 2025
Publication title -
IEEE Transactions on Visualization and Computer Graphics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.005
H-Index - 144
eISSN - 1941-0506
pISSN - 1077-2626
DOI - 10.1109/TVCG.2025.3616739
Subject(s) - Computing and Processing, Bioengineering, Signal Processing and Analysis
Image-based scene understanding allows Augmented Reality (AR) systems to provide contextual visual guidance in unprepared, real-world environments. While effective on video see-through (VST) head-mounted displays (HMDs), such methods suffer on optical see-through (OST) HMDs due to misregistration between the world-facing camera and the user's eye perspective. To approximate the user's true eye view, we implement and evaluate three software-based eye-perspective rendering (EPR) techniques on a commercially available, untethered OST HMD (Microsoft HoloLens 2): (1) Plane-Proxy EPR, projecting onto a fixed-distance plane; (2) Mesh-Proxy EPR, using SLAM-based reconstruction for projection; and (3) Gaze-Proxy EPR, a novel eye-tracking-based method that aligns the projection with the user's gaze depth. A user study on real-world tasks underscores the importance of accurate EPR and demonstrates gaze-proxy as a lightweight alternative to geometry-based methods. We release our EPR framework as open source.
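The Plane-Proxy technique described in the abstract (projecting the world-facing camera image onto a fixed-distance plane and re-rendering it from the eye's viewpoint) can be framed as a standard plane-induced homography between the camera and eye views. The sketch below is not from the paper's released framework; the function name, calibration matrices, and plane parameters are illustrative assumptions.

```python
import numpy as np

def plane_induced_homography(K_cam, K_eye, R, t, n, d):
    """Map camera-image pixels to eye-image pixels for scene points
    assumed to lie on the proxy plane n . X = d (camera frame).

    K_cam, K_eye : 3x3 intrinsics of world-facing camera and eye view
    R, t         : rotation and translation from camera to eye frame
    n, d         : plane normal (unit, camera frame) and distance
    """
    # Classic plane-induced homography: H = K_eye (R - t n^T / d) K_cam^-1.
    # Points off the proxy plane are misregistered, which is the error
    # that Mesh-Proxy and Gaze-Proxy EPR aim to reduce.
    H = K_eye @ (R - np.outer(t, n) / d) @ np.linalg.inv(K_cam)
    return H / H[2, 2]  # normalize so H[2, 2] == 1

# Illustrative use: identical intrinsics, eye offset 3 cm from camera,
# proxy plane 2 m in front (values are made up for the example).
K = np.array([[500.0, 0.0, 320.0], [0.0, 500.0, 240.0], [0.0, 0.0, 1.0]])
H = plane_induced_homography(K, K, np.eye(3),
                             np.array([0.03, 0.0, 0.0]),
                             np.array([0.0, 0.0, 1.0]), 2.0)
```

For Gaze-Proxy EPR, the same formula would apply with `d` replaced by the eye-tracked gaze depth, so the plane tracks the depth the user is attending to rather than staying fixed.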