Open Access
Non-Intrusive Luminance Mapping via High Dynamic Range Imaging and 3-D Reconstruction
Author(s) -
Michael Kim,
Athanasios Tzempelikos
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/2042/1/012113
Subject(s) - luminance, computer vision, computer science, artificial intelligence, high dynamic range, pipeline (software), context (archaeology), computer graphics (images), photogrammetry, projection (relational algebra), position (finance), high dynamic range imaging, dynamic range, geography, economics, programming language, archaeology, finance, algorithm
Continuous luminance monitoring is challenging because high-dynamic-range (HDR) cameras are expensive, require programming, and are intrusive when placed near the occupants' field of view. A new semi-automated and non-intrusive framework is presented for monitoring occupant-perceived luminance using a low-cost camera sensor and a Structure-from-Motion (SfM)/Multi-View Stereo (MVS) photogrammetry pipeline. Using a short video and a few photographs taken from the occupant position, the 3D geometry of the space is automatically reconstructed. The retrieved 3D context enables back-projection of the camera-captured luminance distribution into 3D space, which is in turn re-projected to occupant fields of view. The framework was tested and validated in a testbed office. The re-projected luminance field showed good agreement with luminance measured at the occupant position. The new method can be used for non-intrusive luminance monitoring integrated with daylighting control applications.
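The abstract does not include code, but the back-projection/re-projection step can be illustrated with a minimal sketch. The snippet below is an assumption-laden illustration, not the authors' implementation: it presumes a calibrated per-pixel luminance image from the monitoring camera, camera intrinsics and poses (as an SfM/MVS pipeline such as COLMAP would provide), and a reconstructed 3D point cloud; the function names (project, sample_luminance, reproject_to_occupant) are hypothetical.

    # Minimal sketch (assumed workflow, not the paper's code): look up luminance
    # for reconstructed 3D points seen by the monitoring camera, then splat those
    # values into a virtual camera placed at the occupant position.
    import numpy as np

    def project(K, R, t, pts):
        """Project Nx3 world points into pixels; return uv coords and depth."""
        cam = (R @ pts.T + t.reshape(3, 1)).T      # world -> camera frame
        z = cam[:, 2]
        uv = (K @ cam.T).T
        uv = uv[:, :2] / uv[:, 2:3]                # perspective divide
        return uv, z

    def sample_luminance(lum_img, uv, z):
        """Nearest-pixel luminance lookup for points in front of the camera."""
        h, w = lum_img.shape
        u = np.round(uv[:, 0]).astype(int)
        v = np.round(uv[:, 1]).astype(int)
        ok = (z > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
        lum = np.full(len(z), np.nan)
        lum[ok] = lum_img[v[ok], u[ok]]
        return lum

    def reproject_to_occupant(lum_img, K_cam, R_cam, t_cam,
                              K_occ, R_occ, t_occ, pts, out_shape):
        """Back-project camera luminance onto 3D points, then render a sparse
        luminance map from the occupant field of view."""
        uv_cam, z_cam = project(K_cam, R_cam, t_cam, pts)
        lum = sample_luminance(lum_img, uv_cam, z_cam)

        uv_occ, z_occ = project(K_occ, R_occ, t_occ, pts)
        h, w = out_shape
        u = np.round(uv_occ[:, 0]).astype(int)
        v = np.round(uv_occ[:, 1]).astype(int)
        ok = (~np.isnan(lum)) & (z_occ > 0) & (u >= 0) & (u < w) & (v >= 0) & (v < h)
        out = np.full(out_shape, np.nan)
        out[v[ok], u[ok]] = lum[ok]                # simple splat; no occlusion test
        return out

In practice the point cloud, poses, and intrinsics would come from the SfM/MVS reconstruction and the luminance image from HDR-calibrated exposures of the low-cost camera; occlusion handling and hole filling, which a full pipeline would need, are deliberately omitted here.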
