A new approach to outdoor illumination estimation based on statistical analysis for augmented reality
Author(s) - Liu Yanli, Qin Xueying, Xing Guanyu, Peng Qunsheng
Publication year - 2010
Publication title - Computer Animation and Virtual Worlds
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.225
H-Index - 49
eISSN - 1546-427X
pISSN - 1546-4261
DOI - 10.1002/cav.357
Subject(s) - computer science , rendering (computer graphics) , image based lighting , computer vision , artificial intelligence , augmented reality , consistency (knowledge bases) , computer graphics (images) , virtual reality , coherence (philosophical gambling strategy) , image based modeling and rendering , mathematics , statistics
Illumination consistency plays an important role in realistically rendering virtual characters integrated into a live video of a real scene. This paper proposes a novel method for estimating the illumination conditions of outdoor videos captured from a fixed viewpoint. We first derive an analytical model that relates the statistics of an image to the lighting parameters of the scene under the basic illumination model. Exploiting this model, we then develop a framework to estimate the lighting conditions of live videos. To apply this approach to scenes containing dynamic objects such as intruding pedestrians and swaying trees, we enforce two constraints, spatial and temporal illumination coherence, to refine the solution. Our approach requires no geometric information about the scene and is fast enough for real‐time performance. Experiments show that with the lighting parameters recovered by our method, virtual characters can be seamlessly integrated into the live video. Copyright © 2010 John Wiley & Sons, Ltd.
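The abstract does not reproduce the analytical model itself. Below is a minimal Python sketch of the general idea, assuming a simple Lambertian-style model with ambient and direct sun terms, per-frame luminance statistics as the observations, a mask to exclude dynamic pixels (spatial coherence), and exponential smoothing of the recovered parameters (temporal coherence). All function names, formulas, and constants here are illustrative assumptions, not the paper's actual equations.

```python
import numpy as np

def frame_statistics(frame, static_mask=None):
    """Mean and variance of luminance over static pixels.
    static_mask (optional boolean array) excludes dynamic objects such as
    pedestrians or swaying trees, approximating spatial coherence."""
    luma = 0.299 * frame[..., 0] + 0.587 * frame[..., 1] + 0.114 * frame[..., 2]
    if static_mask is not None:
        luma = luma[static_mask]
    return luma.mean(), luma.var()

def estimate_lighting(mean_ref, var_ref, mean_cur, var_cur):
    """Toy solver (assumed, not the paper's model): relate current-frame
    statistics to a reference frame with known lighting. We assume the mean
    tracks total irradiance (ambient + direct) and the variance tracks the
    directional sun term, since shading contrast grows with direct light."""
    direct = np.sqrt(max(var_cur, 1e-6) / max(var_ref, 1e-6))
    ambient = mean_cur / max(mean_ref, 1e-6) - direct
    return max(ambient, 0.0), max(direct, 0.0)

def temporal_smooth(prev_params, cur_params, alpha=0.9):
    """Temporal illumination coherence: exponentially smooth the lighting
    parameters so outdoor lighting changes gradually between frames."""
    return tuple(alpha * p + (1.0 - alpha) * c
                 for p, c in zip(prev_params, cur_params))
```

In use, one would compute statistics per frame, solve for the ambient/direct pair, and smooth it over time before feeding the parameters to the renderer; the paper's actual framework derives this relation analytically rather than from the heuristic ratios sketched above.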
