A systematic framework for on‐line calibration of a head‐mounted projection display for augmented‐reality systems
Author(s) - Hua Hong, Gao Chunyu
Publication year - 2007
Publication title - Journal of the Society for Information Display
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.578
H-Index - 52
eISSN - 1938-3657
pISSN - 1071-0922
DOI - 10.1889/1.2812991
Subject(s) - subpixel rendering , computer science , computer vision , augmented reality , artificial intelligence , calibration , projection (relational algebra) , process (computing) , line (geometry) , computer graphics (images) , optical head mounted display , virtual image , pixel , algorithm , statistics , mathematics , geometry , operating system
Abstract - Augmented reality (AR) is a technology in which computer-generated virtual images are dynamically superimposed on a real-world scene to enhance a user's perception of the physical environment. A successful AR system requires that the overlaid digital information be accurately aligned with the user's view of the real world, a process known as registration. Accurate registration requires knowledge of both the intrinsic and extrinsic parameters of the viewing device; these parameters form the viewing and projection transformations used to render the virtual images. In our previous work, an easy off-line calibration method was presented in which an image-based automatic matching method established the world-to-image correspondences, achieving subpixel accuracy. However, this off-line method yields accurate registration only when the user's eye placement relative to the display device coincides with the location established during the off-line calibration process. A likely deviation of eye placement, for instance due to helmet slippage or to user-dependent factors such as interpupillary distance, will lead to misregistration. In this paper, a systematic on-line calibration framework is presented to refine the off-line calibration results and to account for user-dependent factors. Specifically, based on an equivalent viewing projection model, a six-parameter on-line calibration method is presented to refine the user-dependent parameters in the viewing transformations. Calibration procedures and results, as well as evaluation experiments, are described in detail. The evaluation experiments demonstrate the improvement in registration accuracy.
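
For a concrete picture of the quantities the abstract refers to, the sketch below illustrates, under common pinhole-model assumptions, how intrinsic and extrinsic (viewing) parameters map world points to image points, and how six user-dependent viewing parameters (three rotational, three translational) could be refined from on-line world-to-image correspondences by minimizing reprojection error. This is a generic NumPy/SciPy illustration, not the paper's implementation; the intrinsic matrix K, the nominal extrinsics R0 and t0, and all function and variable names are assumptions for demonstration only.

# Minimal sketch: pinhole projection plus six-parameter refinement of the
# viewing transform (hypothetical names; not the authors' code).
import numpy as np
from scipy.optimize import least_squares
from scipy.spatial.transform import Rotation


def project(world_pts, K, R, t):
    """Pinhole projection: 3-D world points -> 2-D pixel coordinates."""
    cam = world_pts @ R.T + t            # viewing (extrinsic) transform
    img = cam @ K.T                      # projection (intrinsic) transform
    return img[:, :2] / img[:, 2:3]      # perspective divide


def residuals(params6, world_pts, observed_px, K, R0, t0):
    """Reprojection residuals for 3 Euler-angle + 3 translation offsets."""
    dR = Rotation.from_euler("xyz", params6[:3]).as_matrix()
    R = dR @ R0                          # rotational correction to nominal extrinsics
    t = t0 + params6[3:]                 # translational correction (eye placement)
    return (project(world_pts, K, R, t) - observed_px).ravel()


# Example with synthetic correspondences standing in for on-line measurements.
rng = np.random.default_rng(0)
K = np.array([[800.0, 0, 320], [0, 800.0, 240], [0, 0, 1]])
R0, t0 = np.eye(3), np.zeros(3)
world = rng.uniform(-0.5, 0.5, (20, 3)) + np.array([0, 0, 2.0])
# Simulate a small eye-placement deviation (e.g., due to helmet slippage).
true6 = np.array([0.01, -0.02, 0.005, 0.003, -0.002, 0.004])
observed = project(world, K,
                   Rotation.from_euler("xyz", true6[:3]).as_matrix() @ R0,
                   t0 + true6[3:])
fit = least_squares(residuals, np.zeros(6), args=(world, observed, K, R0, t0))
print("recovered six viewing parameters:", fit.x)

In this toy setup the nonlinear least-squares fit recovers the simulated six-parameter deviation from the correspondences alone, which mirrors the role of the on-line stage described in the abstract: the off-line result supplies the nominal model, and the on-line step corrects only the user-dependent part of the viewing transformation.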
