
Controlled experiment of underwater vision-based mapping: A preliminary evaluation
Author(s) -
Fickrie Muhammad,
Poerbandono,
Harald Sternberg
Publication year - 2021
Publication title -
IOP Conference Series: Earth and Environmental Science
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.179
H-Index - 26
eISSN - 1755-1315
pISSN - 1755-1307
DOI - 10.1088/1755-1315/925/1/012054
Subject(s) - computer vision , artificial intelligence , underwater , computer science , sonar , simultaneous localization and mapping , fiducial marker , robot , position (finance) , calibration , global positioning system , frame (networking) , mobile robot , geography , mathematics , telecommunications , statistics , archaeology , finance , economics
Underwater vision-based mapping (VbM) constructs a three-dimensional (3D) map and the robot position simultaneously using a quasi-continuous structure-from-motion (SfM) method. This is the so-called simultaneous localization and mapping (SLAM), which may be beneficial for mapping shallow seabed features as it is free from the parasitic returns found in sonar surveys. This paper presents a discussion resulting from a small-scale test of a 3D underwater positioning task. We analyse the setup and performance of a standard web camera used for such a task while fully submerged underwater. SLAM estimates the robot (i.e. camera) position from the constructed 3D map by reprojecting the detected features (points) onto the camera scene. A marker-based camera calibration is used to eliminate refraction effects due to light propagation in the water column. To analyse positioning accuracy, a fiducial marker-based system (with millimetre-level reprojection error) is used as the trajectory's ground truth. A controlled experiment with a standard web camera running at 30 fps (frames per second) shows that such a system is capable of robustly performing an underwater navigation task. Sub-metre accuracy is achieved using at least one pose per second (1 Hz).
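The pose-from-reprojection idea described in the abstract can be sketched with a pinhole camera model: known 3D map points are projected into the image with a candidate camera pose, and the pixel distance to the detected features (the reprojection error) measures how well that pose fits. The following is a minimal illustrative sketch, not the authors' implementation; the intrinsics, pose, and points are hypothetical values.

```python
import math

def project_point(pw, R, t, K):
    """Project a 3D world point into pixel coordinates (pinhole model)."""
    # Transform into the camera frame: pc = R @ pw + t
    pc = [sum(R[i][j] * pw[j] for j in range(3)) + t[i] for i in range(3)]
    x, y, z = pc
    fx, fy, cx, cy = K
    return (fx * x / z + cx, fy * y / z + cy)

def reprojection_error(points3d, observations, R, t, K):
    """Mean pixel distance between reprojected map points and detections."""
    errs = []
    for pw, uv in zip(points3d, observations):
        u, v = project_point(pw, R, t, K)
        errs.append(math.hypot(u - uv[0], v - uv[1]))
    return sum(errs) / len(errs)

# Hypothetical identity pose: camera at the origin looking down +Z.
R = [[1, 0, 0], [0, 1, 0], [0, 0, 1]]
t = [0.0, 0.0, 0.0]
K = (800.0, 800.0, 320.0, 240.0)  # fx, fy, cx, cy in pixels (assumed)

points3d = [(0.0, 0.0, 2.0), (0.5, -0.2, 3.0)]
obs = [project_point(p, R, t, K) for p in points3d]  # perfect detections
print(reprojection_error(points3d, obs, R, t, K))   # -> 0.0
```

In a full SLAM pipeline the pose (R, t) would be optimized to minimize this error over many points per frame; the marker-based calibration mentioned above supplies the intrinsics K, including correction for underwater refraction.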