Temporally Consistent Motion Segmentation From RGB‐D Video
Author(s) - Bertholet P., Ichim A.E., Zwicker M.
Publication year - 2018
Publication title - Computer Graphics Forum
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.578
H-Index - 120
eISSN - 1467-8659
pISSN - 0167-7055
DOI - 10.1111/cgf.13316
Subject(s) - computer science , computer vision , artificial intelligence , segmentation , motion (physics) , rgb color model , structure from motion , energy (signal processing) , energy minimization , motion estimation , mathematics , physics , statistics , quantum mechanics
Temporally consistent motion segmentation from RGB‐D videos is challenging because of the limitations of current RGB‐D sensors. We formulate segmentation as a motion assignment problem, where a motion is a sequence of rigid transformations through all frames of the input. We capture the quality of each potential assignment by defining an appropriate energy function that accounts for occlusions and a sensor‐specific noise model. To make energy minimization tractable, we work with a discrete set instead of the continuous, high‐dimensional space of motions, where the discrete motion set provides an upper bound for the original energy. We repeatedly minimize our energy, and in each step extend and refine the motion set to further lower the bound. A quantitative comparison to the current state of the art demonstrates the benefits of our approach in difficult scenarios.
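
The abstract describes an alternation between a discrete assignment step (each point picks the candidate motion that minimizes its cost) and a refinement step (each motion's per-frame rigid transforms are re-fit to the points assigned to it). The following is a minimal sketch of that alternation, not the paper's method: it assumes an isotropic Gaussian noise model with a single sigma in place of the sensor-specific model, omits occlusion handling, spatial regularization, and the extension of the candidate motion set, and all names (fit_rigid, assignment_cost, segment) are hypothetical.

```python
# Simplified alternating scheme: discrete motion assignment followed by
# per-motion rigid refitting. Assumptions (not from the paper): isotropic
# Gaussian noise, no occlusion term, no spatial smoothness, fixed motion set.
import numpy as np

def fit_rigid(src, dst):
    """Least-squares rigid transform (R, t) mapping src -> dst (Kabsch)."""
    mu_s, mu_d = src.mean(axis=0), dst.mean(axis=0)
    H = (src - mu_s).T @ (dst - mu_d)
    U, _, Vt = np.linalg.svd(H)
    D = np.diag([1.0, 1.0, np.sign(np.linalg.det(Vt.T @ U.T))])
    R = Vt.T @ D @ U.T
    return R, mu_d - R @ mu_s

def assignment_cost(points0, points_t, motions, sigma=0.01):
    """Cost of assigning each point to each candidate motion, summed over frames."""
    n, k = len(points0), len(motions)
    cost = np.zeros((n, k))
    for j, traj in enumerate(motions):            # traj: list of (R, t), one per frame
        for (R, t), obs in zip(traj, points_t):   # obs: observed points in that frame
            pred = points0 @ R.T + t              # predicted positions under motion j
            cost[:, j] += np.sum((pred - obs) ** 2, axis=1) / sigma ** 2
    return cost

def segment(points0, points_t, motions, iters=10):
    """Alternate between discrete assignment and rigid motion refinement."""
    for _ in range(iters):
        labels = assignment_cost(points0, points_t, motions).argmin(axis=1)
        for j in range(len(motions)):             # refit each motion to its points
            idx = np.where(labels == j)[0]
            if len(idx) < 3:                      # need enough points for a rigid fit
                continue
            motions[j] = [fit_rigid(points0[idx], obs[idx]) for obs in points_t]
    return labels, motions
```

In the paper's formulation the candidate motion set is also extended and refined between minimization steps so that the discrete upper bound on the energy keeps decreasing; the sketch above keeps that set fixed for brevity.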
