Open Access
Segmentation of Enhalus acoroides seagrass from underwater images using the Mask R-CNN method
Author(s) -
Slamet Adji Pamungkas,
I Kadek Noppi Adi Jaya,
M. Iqbal
Publication year - 2021
Publication title -
IOP Conference Series: Earth and Environmental Science
Language(s) - English
Resource type - Journals
eISSN - 1755-1307
pISSN - 1755-1315
DOI - 10.1088/1755-1315/944/1/012010
Subject(s) - seagrass, segmentation, overfitting, artificial intelligence, computer science, underwater, pattern recognition (psychology), environmental science, artificial neural network, geology, oceanography, ecology, ecosystem, biology
Seagrass is a Spermatophyta (seed-bearing) plant that plays many roles in coastal waters, including serving as a primary producer in the food chain. Seagrass meadows and their condition must be monitored to maintain a healthy marine ecosystem, and monitoring requires detecting and segmenting the seagrass in imagery. The purpose of this study is to implement the Mask R-CNN algorithm and measure its performance in detecting and segmenting Enhalus acoroides. The dataset consists of 500 Enhalus acoroides images that went through a color-correction and labelling process. Training was performed with a learning rate of 0.001 and a batch size of 4, and image augmentation was applied to avoid overfitting. The optimum weights were obtained after training for 100 epochs. A confusion matrix was used to evaluate detection performance, and linear regression was used to evaluate the segmentation produced by the model. The evaluation yielded an accuracy of 0.9246, a precision of 0.9507, a recall of 0.9712 and a correlation coefficient of 0.8771. These values indicate that the model can detect and segment the seagrass Enhalus acoroides well and accurately.
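The paper does not publish its code, but the reported setup maps naturally onto a standard Mask R-CNN pipeline. The sketch below is a minimal illustration, not the authors' implementation: it assumes torchvision's Mask R-CNN with a COCO-pretrained ResNet-50 FPN backbone, two classes (background plus Enhalus acoroides), and the hyperparameters stated in the abstract (learning rate 0.001, batch size 4, flip augmentation, 100 epochs). The dataset and augmentation code are omitted.

    # Illustrative sketch only; hyperparameters taken from the abstract,
    # implementation details (torchvision, ResNet-50 FPN) are assumptions.
    import torch
    from torchvision.models.detection import maskrcnn_resnet50_fpn
    from torchvision.models.detection.faster_rcnn import FastRCNNPredictor
    from torchvision.models.detection.mask_rcnn import MaskRCNNPredictor

    NUM_CLASSES = 2  # background + Enhalus acoroides

    def build_model():
        # Start from a COCO-pretrained model, then replace both heads
        # so the box and mask predictors output our two classes.
        model = maskrcnn_resnet50_fpn(weights="DEFAULT")
        in_features = model.roi_heads.box_predictor.cls_score.in_features
        model.roi_heads.box_predictor = FastRCNNPredictor(in_features, NUM_CLASSES)
        in_mask = model.roi_heads.mask_predictor.conv5_mask.in_channels
        model.roi_heads.mask_predictor = MaskRCNNPredictor(in_mask, 256, NUM_CLASSES)
        return model

    model = build_model()
    optimizer = torch.optim.SGD(model.parameters(), lr=0.001, momentum=0.9)
    # A DataLoader would yield (images, targets) batches of size 4, with
    # random flips as augmentation; the dataset class is omitted here, e.g.:
    # loader = torch.utils.data.DataLoader(dataset, batch_size=4, shuffle=True,
    #                                      collate_fn=lambda b: tuple(zip(*b)))

The evaluation can be reproduced in the same spirit: the detection metrics follow directly from confusion-matrix counts, and the segmentation correlation is a Pearson coefficient between predicted and ground-truth seagrass cover. The numbers below are placeholders, not the study's data.

    import numpy as np

    def detection_metrics(tp, fp, fn, tn):
        # Standard confusion-matrix metrics, as reported in the abstract.
        accuracy = (tp + tn) / (tp + fp + fn + tn)
        precision = tp / (tp + fp)
        recall = tp / (tp + fn)
        return accuracy, precision, recall

    # Pearson correlation between predicted and ground-truth seagrass pixel
    # fractions per image, analogous to the reported coefficient of 0.8771
    # (toy values for illustration).
    pred_cover = np.array([0.42, 0.18, 0.55, 0.30])
    true_cover = np.array([0.40, 0.22, 0.60, 0.28])
    r = np.corrcoef(pred_cover, true_cover)[0, 1]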
