Open Access
A New Motion Segmentation Technique using Foreground-Background Bimodal
Author(s) - Ma'moun Al-Smadi, Khairi Abdul Rahim, Rosalina Abdul Salam
Publication year - 2018
Publication title - Malaysian Journal of Science, Health and Technology
Language(s) - English
Resource type - Journals
ISSN - 2601-0003
DOI - 10.33102/mjosht.v2i.44
Subject(s) - background subtraction, computer science, artificial intelligence, foreground detection, computer vision, segmentation, frame (networking), computation, adaptation (eye), variance (accounting), pattern recognition (psychology), pixel, algorithm, telecommunications, physics, accounting, optics, business
Vehicle detection is a fundamental step in urban traffic surveillance systems, since it provides the information needed for further processing. Conventional techniques rely on either background subtraction or foreground appearance-based detection, which suffer from poor adaptation or high computational cost, respectively. The complexity of urban traffic scenes lies in pose and orientation variations, slow or temporarily stopped vehicles, and sudden illumination changes. In this work, a foreground-background bimodal model is proposed to adapt to scene variation and complexity. Cumulative frame differencing and sigma-delta estimation are used to model the foreground and background, respectively. A correction feedback updates each model iteratively and recursively based on the detection mask of the other model. The variance update of the sigma-delta estimator is limited to background temporal activity, while cumulative frame differencing accounts for the moving foreground by discarding small background variations. Comparative experimental results on typical urban traffic sequences show that the proposed technique achieves robust and accurate detection, improving adaptation, reducing false detections, and satisfying real-time requirements.
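The two building blocks named in the abstract, sigma-delta background estimation and cumulative frame differencing, can be sketched roughly as follows. This is a minimal illustration, not the authors' implementation: the amplification factor `N`, decay factor `alpha`, thresholds, and clipping bounds are illustrative assumptions, and the paper's cross-model correction feedback is omitted.

```python
import numpy as np

def sigma_delta_step(frame, M, V, N=4, V_min=2, V_max=255):
    """One iteration of sigma-delta background estimation.

    frame, M (background estimate), and V (variance estimate) are
    integer 2-D arrays of the same shape. Returns the updated model
    and the foreground detection mask.
    """
    frame = frame.astype(np.int32)
    M = M.astype(np.int32)
    V = V.astype(np.int32)

    # Background estimate: move one gray level toward the current frame.
    M += np.sign(frame - M)

    # Absolute difference between the frame and the background model.
    O = np.abs(frame - M)

    # Variance estimate: track N times the difference by unit steps,
    # updated only where there is temporal activity (O != 0).
    active = O != 0
    V[active] += np.sign(N * O[active] - V[active])
    V = np.clip(V, V_min, V_max)

    # Pixels whose difference exceeds the variance are foreground.
    mask = O > V
    return M, V, mask

def cumulative_diff_step(frame, prev, C, alpha=0.9, T=15):
    """One iteration of cumulative frame differencing.

    Accumulates inter-frame differences into C with exponential decay,
    so small transient background variations are discarded while
    persistently moving regions accumulate above the threshold T.
    """
    d = np.abs(frame.astype(np.int32) - prev.astype(np.int32))
    C = alpha * C + d
    mask = C > T
    return C, mask
```

In a full pipeline each model's detection mask would gate the update of the other, which is the correction feedback the abstract describes; here the two steps are shown independently.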
