Open Access
Integrating Region and Boundary Information for Improved Spatial Coherence in Object Tracking
Author(s) - Desmond Chung, W. James MacLean, Sven Dickinson
Publication year - 2004
Publication title - Proceedings of the 2004 IEEE Computer Society Conference on Computer Vision and Pattern Recognition (CVPR 2004)
Language(s) - English
Resource type - Book series
ISBN - 0-7695-2158-4
DOI - 10.1109/cvpr.2004.96
This paper describes a novel method for performing spatially coherent motion estimation by integrating region and boundary information. The method begins with a layered, parametric flow model. Since the resulting flow estimates are typically sparse, we use the computed motion in a novel way to compare intensity values between images, thereby providing improved spatial coherence of a moving region. This dense set of intensity constraints is then used to initialize an active contour, which is influenced by both motion and intensity data to track the object's boundary. The active contour, in turn, provides additional spatial coherence by identifying motion constraints within the object boundary and using them exclusively in subsequent motion estimation for that object. The active contour is therefore automatically initialized once and, in subsequent frames, is warped forward based on the motion model. The spatial coherence constraints provided by both the motion and the boundary information act together to overcome their individual limitations. Furthermore, the approach is general, and makes no assumptions about a static background and/or a static camera. We apply the method to image sequences in which both the object and the background are moving.
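The abstract outlines a region-boundary feedback loop: estimate the object's parametric motion, use the resulting warp to find intensity-consistent pixels, fit a boundary to that dense support, and restrict the next motion estimate to constraints inside the boundary. The sketch below illustrates that loop with OpenCV under simplifying assumptions: a feature-based affine estimate stands in for the paper's layered parametric flow model, and warping the previous object mask forward and intersecting it with the consistency support stands in for the intensity- and motion-driven active contour. Function names, parameters, and thresholds are illustrative, not the authors' implementation.

```python
import numpy as np
import cv2


def estimate_affine_flow(prev_gray, next_gray, mask=None):
    """Feature-based affine motion estimate (stand-in for the layered parametric flow model)."""
    pts = cv2.goodFeaturesToTrack(prev_gray, maxCorners=400, qualityLevel=0.01,
                                  minDistance=7, mask=mask)
    if pts is None:
        return np.eye(2, 3, dtype=np.float64)
    nxt, status, _ = cv2.calcOpticalFlowPyrLK(prev_gray, next_gray, pts, None)
    ok = status.flatten() == 1
    A, _ = cv2.estimateAffine2D(pts[ok], nxt[ok], method=cv2.RANSAC)
    return A if A is not None else np.eye(2, 3, dtype=np.float64)


def motion_consistency_mask(prev_gray, next_gray, A, thresh=12):
    """Pixels whose intensity agrees with the warp: the dense support that seeds the boundary."""
    h, w = prev_gray.shape
    warped = cv2.warpAffine(prev_gray, A, (w, h))
    diff = cv2.absdiff(warped, next_gray)
    support = (diff < thresh).astype(np.uint8) * 255
    # Morphological opening removes isolated false matches (illustrative choice).
    return cv2.morphologyEx(support, cv2.MORPH_OPEN, np.ones((5, 5), np.uint8))


def track(frames, init_mask):
    """frames: list of grayscale uint8 images; init_mask: one-time binary object mask (255 inside)."""
    mask = init_mask
    boundaries = []
    for prev, nxt in zip(frames[:-1], frames[1:]):
        # Region step: motion constraints are drawn only from inside the current boundary.
        A = estimate_affine_flow(prev, nxt, mask=mask)
        # Dense intensity-consistency support under the estimated motion.
        support = motion_consistency_mask(prev, nxt, A)
        # Boundary step (crude substitute for the active contour): warp the
        # previous mask forward with the motion model, then keep only pixels
        # that are also intensity-consistent with that motion.
        mask = cv2.bitwise_and(cv2.warpAffine(mask, A, mask.shape[::-1]), support)
        contours, _ = cv2.findContours(mask, cv2.RETR_EXTERNAL, cv2.CHAIN_APPROX_SIMPLE)
        boundaries.append(contours)
    return boundaries
```

In this simplified loop the mask plays the role the active contour plays in the paper, feeding spatial coherence back into the next motion estimate; the full method instead evolves an explicit contour driven by both motion and intensity data.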
