True Color Correction of Autonomous Underwater Vehicle Imagery
Author(s) -
Bryson Mitch,
Johnson-Roberson Matthew,
Pizarro Oscar,
Williams Stefan B.
Publication year - 2016
Publication title -
Journal of Field Robotics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 1.152
H-Index - 96
eISSN - 1556-4967
pISSN - 1556-4959
DOI - 10.1002/rob.21638
Subject(s) - vignetting , underwater , computer vision , artificial intelligence , color correction , computer science , ocean color , image stitching , remote sensing , photogrammetry , geology , image (mathematics) , engineering , satellite , oceanography , aerospace engineering , petroleum engineering , lens (geology)
This paper presents an automated approach to recovering the true color of objects on the seafloor in images collected from multiple perspectives by an autonomous underwater vehicle (AUV) during the construction of three‐dimensional (3D) seafloor models and image mosaics. When capturing images underwater, the water column induces several effects on light that are typically negligible in air, such as color‐dependent attenuation and backscatter. AUVs must typically carry artificial lighting when operating at depths below 20‐30 m, and the lighting pattern generated is usually not spatially consistent. These effects cause problems for human interpretation of images, limit the ability to use color to identify benthic biota or quantify changes over multiple dives, and confound computer‐based techniques for clustering and classification. Our approach exploits the 3D structure of the scene, generated using structure‐from‐motion and photogrammetry techniques, to provide basic spatial data to an underwater image formation model. Parameters that depend on the properties of the water column are estimated from the image data itself, rather than from fixed in situ infrastructure such as reflectance panels or detailed data on water constituents. The model accounts for distance‐based attenuation and backscatter, camera vignetting, and the artificial lighting pattern, recovering measurements of true color (reflectance) and thus allowing us to approximate the appearance of the scene as if imaged in air and illuminated from above. Our method is validated against known color targets using imagery collected in different underwater environments by two AUVs that are routinely used as part of a benthic habitat monitoring program.
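The distance-dependent part of such a correction can be illustrated with a minimal sketch. This assumes a simplified per-channel image formation model of the Jaffe–McGlashan form, I = J·exp(−βd) + B·(1 − exp(−βd)), where J is the true color, d the camera-to-scene distance (here supplied by the 3D model), β a per-channel attenuation coefficient, and B the backscatter color; the coefficient values below are hypothetical, and the paper's full method additionally estimates vignetting and the artificial lighting pattern from the image data.

```python
import numpy as np

def correct_underwater_color(image, distance, beta, b_inf):
    """Invert a simplified underwater image formation model.

    Assumes each observed channel follows
        I_c = J_c * exp(-beta_c * d) + B_c * (1 - exp(-beta_c * d)),
    where J is the true (in-air) color, d the camera-to-scene
    distance in meters, beta_c a per-channel attenuation coefficient,
    and B_c the backscatter (veiling light) color.
    """
    # Per-pixel, per-channel transmission: shape (H, W, 3).
    transmission = np.exp(-beta[None, None, :] * distance[:, :, None])
    # Subtract backscatter, then undo attenuation.
    restored = (image - b_inf * (1.0 - transmission)) / np.clip(
        transmission, 1e-6, None
    )
    return np.clip(restored, 0.0, 1.0)

# Hypothetical round-trip check: simulate a uniform grey target seen
# through 3 m of water, then recover its true color.
true_color = np.full((4, 4, 3), 0.6)   # uniform grey seafloor patch
distance = np.full((4, 4), 3.0)        # 3 m camera-to-scene distance
beta = np.array([0.6, 0.2, 0.1])       # red attenuates fastest (1/m)
b_inf = np.array([0.05, 0.2, 0.3])     # bluish backscatter color

transmission = np.exp(-beta * 3.0)
observed = true_color * transmission + b_inf * (1.0 - transmission)
recovered = correct_underwater_color(observed, distance, beta, b_inf)
```

In the paper itself the attenuation and backscatter parameters are not assumed known as above, but are jointly estimated from the multi-view imagery and the photogrammetric scene geometry.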
