Open Access
An Occlusion Handling Evaluation Criterion for Deep Learning Object Segmentation
Author(s) -
Cheng Yang,
Peter Han Joo Chong,
Phan Truong Lam
Publication year - 2021
Publication title -
Journal of Physics: Conference Series
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.21
H-Index - 85
eISSN - 1742-6596
pISSN - 1742-6588
DOI - 10.1088/1742-6596/1880/1/012008
Subject(s) - segmentation , artificial intelligence , occlusion , computer vision , computer science , image segmentation , pixel , deep learning , pattern recognition
This paper introduces a novel evaluation criterion for occlusion handling in deep learning object segmentation. Occlusion, defined as objects blocking one another in an image, degrades deep learning object segmentation, and a growing body of research focuses on occlusion handling for this task. However, this research rarely reports a clear evaluation of its occlusion handling, because no suitable evaluation criterion exists. Traditionally, authors show qualitative image results or report accuracy over the entire object boundary; neither approach yields a numerical evaluation that focuses specifically on occlusion handling. This work reports an evaluation criterion that measures occlusion handling performance directly. The criterion computes the shortest distance from each pixel on the ground-truth occlusion edges to the boundary of the segmentation result; these shortest distances are the segmentation errors, and their average value is the final score of the occlusion handling evaluation criterion. An experiment using a deep learning based segmentation model shows that the criterion successfully measures occlusion handling for deep learning based object segmentation.
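
The criterion as described lends itself to a short implementation. Below is a minimal Python sketch, assuming binary NumPy masks as inputs; the function name occlusion_edge_error, the morphological boundary extraction, and the use of SciPy's Euclidean distance transform are illustrative choices based on the abstract, not the paper's exact procedure.

import numpy as np
from scipy.ndimage import binary_erosion, distance_transform_edt

def occlusion_edge_error(pred_mask, gt_occlusion_edges):
    """Mean shortest distance (in pixels) from each ground-truth
    occlusion-edge pixel to the predicted segmentation boundary."""
    pred = pred_mask.astype(bool)
    edges = gt_occlusion_edges.astype(bool)

    # Predicted boundary: mask pixels that vanish under one erosion step.
    boundary = pred & ~binary_erosion(pred)

    # distance_transform_edt gives each pixel its Euclidean distance to the
    # nearest zero, so invert the boundary mask to measure distance to it.
    dist_to_boundary = distance_transform_edt(~boundary)

    # The distances sampled at the occlusion-edge pixels are the per-pixel
    # segmentation errors; their mean is the criterion's final score.
    return float(dist_to_boundary[edges].mean())

Under this reading, a predicted mask whose boundary passes exactly through every ground-truth occlusion-edge pixel scores 0, and the score grows as the predicted boundary drifts away from the occluded contour.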
