Open Access
Deep Light Direction Reconstruction from single RGB images
Author(s) -
M. F. Miller,
Alfred Nischwitz,
Rüdiger Westermann
Publication year - 2021
Publication title -
Computer Science Research Notes
Language(s) - English
Resource type - Conference proceedings
eISSN - 2464-4625
pISSN - 2464-4617
DOI - 10.24132/csrn.2021.3002.4
Subject(s) - rgb color model , artificial intelligence , computer science , azimuth , computer vision , elevation (ballistics) , elevation angle , iterative reconstruction , deep learning , artificial neural network , computer graphics (images) , optics , mathematics , geometry , physics
In augmented reality applications, consistent illumination between virtual and real objects is important for creating an immersive user experience. Consistent illumination can be achieved by an appropriate parameterisation of the virtual illumination model that matches real-world lighting conditions. In this study, we developed a method to reconstruct the general light direction from red-green-blue (RGB) images of real-world scenes using a modified VGG-16 neural network. We reconstructed the general light direction as azimuth and elevation angles. To avoid inaccurate results caused by coordinate uncertainty occurring at steep elevation angles, we further introduced stereographically projected coordinates. Unlike recent deep-learning-based approaches for reconstructing the light source direction, our approach does not require depth information and thus does not rely on special red-green-blue-depth (RGB-D) images as input.
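The abstract does not spell out the exact projection used, but the idea of replacing azimuth/elevation angles with stereographically projected coordinates can be sketched as follows. This is a minimal illustration, assuming a projection of the unit light-direction vector from the south pole onto the equatorial plane; near the zenith, where azimuth becomes ill-defined, the projected coordinates vary smoothly instead.

```python
import math

def stereographic_from_angles(azimuth, elevation):
    """Map (azimuth, elevation) in radians to stereographic plane coordinates.

    Illustrative sketch (not the paper's exact formulation): the angles are
    first converted to a unit direction vector, which is then projected
    stereographically from the south pole (0, 0, -1) onto the z = 0 plane.
    Well defined for all directions except straight down (elevation = -pi/2).
    """
    # Unit direction vector on the sphere.
    x = math.cos(elevation) * math.cos(azimuth)
    y = math.cos(elevation) * math.sin(azimuth)
    z = math.sin(elevation)
    # Stereographic projection from (0, 0, -1) onto the equatorial plane.
    return x / (1.0 + z), y / (1.0 + z)
```

At a steep elevation (light almost directly overhead) the azimuth is numerically unstable, yet every azimuth maps to nearly the same point close to the origin of the projected plane, which is the uncertainty the abstract says this parameterisation avoids.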
