Open Access
Foreground extraction for freely moving RGBD cameras
Author(s) - Junejo Imran N., Ahmed Naveed
Publication year - 2018
Publication title - IET Computer Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.38
H-Index - 37
eISSN - 1751-9640
pISSN - 1751-9632
DOI - 10.1049/iet-cvi.2017.0187
Subject(s) - artificial intelligence , computer science , background subtraction , computer vision , rgb color model , histogram , frame (networking) , segmentation , pattern recognition (psychology) , support vector machine , histogram of oriented gradients , feature extraction , image (mathematics) , pixel , telecommunications
In this study, the authors propose a novel method to perform foreground extraction for freely moving RGBD cameras. Although foreground extraction, or background subtraction, has been explored by the computer vision community for a long time, depth‐based subtraction is relatively new and has not yet been extensively addressed. Most current methods make heavy use of geometric reconstruction, which makes the solutions quite restrictive. In this study, the authors make novel use of RGB and depth data: from the RGB frame, they first extract corner features and then represent them with the histogram of oriented gradients (HoG) descriptor. They then train a non‐linear SVM on these HoG descriptors. During the test phase, they exploit the fact that the foreground object has a distinct depth ordering with respect to the rest of the scene. Hence, they use the positively classified FAST (features from accelerated segment test) corner features on the test frame to initiate a region‐growing algorithm, which obtains an accurate foreground segmentation from the depth data alone. The authors evaluate the proposed method on six datasets and report encouraging quantitative and qualitative results.
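The abstract outlines a corner-detect, describe, classify, then region-grow pipeline. Below is a minimal sketch of that idea using OpenCV, NumPy, and scikit-learn; the patch size, HoG parameters, RBF kernel, and depth-difference threshold are illustrative assumptions, not the authors' exact settings.

```python
import cv2
import numpy as np
from collections import deque
from sklearn.svm import SVC

PATCH = 32  # assumed patch size around each corner for the HoG descriptor

# HoG over a fixed window centred on each corner (parameters are assumptions)
hog = cv2.HOGDescriptor((PATCH, PATCH), (16, 16), (8, 8), (8, 8), 9)
fast = cv2.FastFeatureDetector_create()

def corner_hogs(gray):
    """Detect FAST corners and describe a patch around each with HoG."""
    pts, descs = [], []
    for kp in fast.detect(gray, None):
        x, y = int(kp.pt[0]), int(kp.pt[1])
        patch = gray[y - PATCH // 2:y + PATCH // 2, x - PATCH // 2:x + PATCH // 2]
        if patch.shape != (PATCH, PATCH):
            continue  # skip corners too close to the image border
        pts.append((x, y))
        descs.append(hog.compute(patch).ravel())
    return pts, np.array(descs)

def grow_foreground(depth, seeds, tol=0.05):
    """Flood-fill over the depth map from positively classified seed corners,
    accepting 4-neighbours whose depth differs by less than `tol` (assumed metres)."""
    h, w = depth.shape
    mask = np.zeros((h, w), np.uint8)
    q = deque(seeds)
    for x, y in seeds:
        mask[y, x] = 1
    while q:
        x, y = q.popleft()
        for nx, ny in ((x + 1, y), (x - 1, y), (x, y + 1), (x, y - 1)):
            if (0 <= nx < w and 0 <= ny < h and not mask[ny, nx]
                    and abs(float(depth[ny, nx]) - float(depth[y, x])) < tol):
                mask[ny, nx] = 1
                q.append((nx, ny))
    return mask

# Training (hypothetical data): HoG descriptors of labelled corners -> non-linear SVM.
#   svm = SVC(kernel='rbf').fit(X_train, y_train)
# Testing: classify the corners of a new frame, then grow the region on depth alone.
#   pts, descs = corner_hogs(gray_test)
#   seeds = [p for p, label in zip(pts, svm.predict(descs)) if label == 1]
#   mask = grow_foreground(depth_test, seeds)
```

The region-growing step here is a plain depth-similarity flood fill; it stands in for the paper's use of the foreground object's distinct depth ordering, whose exact criterion is not given in the abstract.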
