
Motion and illumination defiant cut detection based on Weber features
Author(s) - Tejaswini Kar, Priyadarshi Kanungo
Publication year - 2018
Publication title - IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/iet-ipr.2017.1237
Subject(s) - computer science , benchmark (surveying) , artificial intelligence , feature (linguistics) , segmentation , credence , pixel , motion (physics) , computer vision , pattern recognition (psychology) , optical flow , exploit , flicker , key (lock) , image (mathematics) , machine learning , linguistics , philosophy , computer security , geodesy , geography , operating system
The rapid proliferation of video data necessitates hierarchical structures for various content management applications, and temporal video segmentation is the key to such management. To address the problem of temporal segmentation, the current communication exploits the psychological behaviour of the human visual system. Towards this goal, an abrupt cut detection scheme is proposed based on Weber's law, which captures the strong spatial correlation among neighbouring pixels. The authors thereby provide a robust solution for abrupt shot boundary detection when frames are affected, partially or fully, by flashlight, fire, flicker, or high motion associated with an object or the camera. Further, they devise a model for generating an automatic threshold from the statistics of the feature vector, which adapts itself to variations in the video content. The effectiveness of the proposed framework is validated by exhaustive comparison with several contemporary and recent approaches on the benchmark datasets TRECVID 2001, TRECVID 2002 and TRECVID 2007, as well as on some publicly available videos. The results give credence to a remarkable improvement in performance while preserving a good trade-off between missed hits and false hits compared with state-of-the-art methods.
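The pipeline summarised in the abstract can be sketched informally in Python. The snippet below is an illustrative assumption, not the authors' exact formulation: it computes a Weber-law differential-excitation map for each grayscale frame and flags an abrupt cut whenever the frame-to-frame feature distance exceeds an adaptive threshold of the form mean + k*std. The function names weber_differential_excitation and detect_abrupt_cuts, and the parameter k, are hypothetical names introduced here for illustration.

import numpy as np

def weber_differential_excitation(frame):
    # Weber differential excitation per pixel:
    # xi = arctan( sum over 8 neighbours of (neighbour - centre) / centre ),
    # i.e. the local intensity change relative to the original intensity (Weber's law).
    f = frame.astype(np.float64)
    padded = np.pad(f, 1, mode='edge')
    h, w = f.shape
    diff_sum = np.zeros_like(f)
    for dy in (-1, 0, 1):
        for dx in (-1, 0, 1):
            if dy == 0 and dx == 0:
                continue
            # Shifted view of the padded image gives the (dy, dx) neighbour of every pixel.
            diff_sum += padded[1 + dy:1 + dy + h, 1 + dx:1 + dx + w] - f
    return np.arctan(diff_sum / (f + 1e-6))

def detect_abrupt_cuts(gray_frames, k=3.0):
    # gray_frames: list of 2-D uint8/float arrays (grayscale frames).
    # A cut is declared where the mean absolute difference between consecutive
    # Weber feature maps exceeds an adaptive threshold mean + k*std
    # (an assumed threshold model, chosen here only for illustration).
    feats = [weber_differential_excitation(g) for g in gray_frames]
    dist = np.array([np.abs(feats[i + 1] - feats[i]).mean()
                     for i in range(len(feats) - 1)])
    threshold = dist.mean() + k * dist.std()
    # Return indices of frames that start a new shot.
    return [i + 1 for i, d in enumerate(dist) if d > threshold]

Because the Weber feature normalises local intensity change by the local intensity itself, a global brightness jump (e.g. a flash) changes the numerator and denominator together, which is one intuition for why such features are less sensitive to illumination disturbances than raw pixel differences.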