Open Access
Dynamic thresholding for video anomaly detection
Author(s) -
Jia Diyang,
Zhang Xiao,
Zhou Joey Tianyi,
Lai Pan,
Wei Yifei
Publication year - 2022
Publication title -
IET Image Processing
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.401
H-Index - 45
eISSN - 1751-9667
pISSN - 1751-9659
DOI - 10.1049/ipr2.12532
Subject(s) - thresholding, anomaly detection, computer science, frame (networking), anomaly (physics), artificial intelligence, ground truth, pattern recognition (psychology), computer vision, image (mathematics), data mining, telecommunications, physics, condensed matter physics
Anomaly detection is one of the most important applications in video surveillance; it involves the temporal localisation of anomalous events in unannotated video sequences. By learning normal patterns to generate frames and computing their reconstruction error relative to the ground truth, a frame can be recognised as abnormal if its reconstruction error exceeds a threshold. Most existing works use a fixed threshold, computed over all the test data, to determine anomalies. However, a fixed-threshold strategy cannot address the challenges posed by dynamic environments, e.g. changes in illumination conditions. In this paper, a dynamic thresholding algorithm (DTA) is proposed, which is fully data‐driven and capable of automatically determining thresholds, so that the resulting anomaly detection system can flexibly adapt to different scenarios. The proposed DTA is independent of the backbone network and can easily be incorporated into most existing video anomaly detection models to help identify appropriate thresholds. On both synthetic and real‐world datasets, the experimental results show that with the proposed DTA, video anomaly detection methods achieve better performance under changing, dynamic environments.
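To make the general idea concrete, here is a minimal sketch of a dynamic (data-driven) threshold over per-frame reconstruction errors. The abstract does not specify the DTA's exact rule, so this example assumes a common adaptive scheme: a sliding window of recent "normal" errors, with the threshold set to the window mean plus a multiple of the window standard deviation. The function name, window size, and multiplier `k` are all hypothetical choices for illustration, not the paper's method.

```python
import numpy as np
from collections import deque

def detect_anomalies(errors, window=50, k=3.0):
    """Flag frames whose reconstruction error exceeds an adaptive
    threshold mean + k*std over a sliding window of recent errors.
    Hypothetical sketch: the paper's DTA is data-driven, but its
    exact rule is not given in the abstract."""
    history = deque(maxlen=window)  # recent errors judged normal
    flags = []
    for e in errors:
        if len(history) >= 2:
            mu = np.mean(history)
            sigma = np.std(history)
            flags.append(bool(e > mu + k * sigma))
        else:
            flags.append(False)  # warm-up: not enough history yet
        # Only frames judged normal update the window, so a detected
        # anomaly does not inflate the threshold for later frames.
        if not flags[-1]:
            history.append(e)
    return flags
```

Because the threshold tracks the recent error distribution rather than a single global value, it rises and falls with slow scene changes (e.g. gradual illumination shifts) while still flagging sudden spikes.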
