Detecting intruders using time‐series data by projection pattern of silhouette
Author(s) - Kawasumi Hiroaki, Sekii Hiroshi, Enomoto Nobuyoshi, Ohata Hajime, Okazaki Akio
Publication year - 1997
Publication title - Electrical Engineering in Japan
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.136
H-Index - 28
eISSN - 1520-6416
pISSN - 0424-7760
DOI - 10.1002/(sici)1520-6416(19970415)119:1<62::aid-eej8>3.0.co;2-h
Subject(s) - silhouette , artificial intelligence , computer vision , projection (relational algebra) , inter frame , computer science , pattern recognition (psychology) , frame (networking) , brightness , similarity (geometry) , series (stratigraphy) , image (mathematics) , reference frame , algorithm , telecommunications , paleontology , physics , optics , biology
This paper describes a method of detecting intruders by video image processing. Previous studies have detected intruders in video images by analyzing difference‐binary images and motion vectors. Because these methods rely on simple features such as the size and motion vectors of changing regions, they cannot fully suppress erroneous detections caused by other factors, such as small animals, swaying trees, and changes in brightness. We propose a method of detecting intruders using time‐series data on the projection pattern of the silhouette. In this method, changing regions in video images are extracted by the interframe differential technique and are transformed into projection patterns. Evaluation scores (a measure of similarity to a human silhouette) are calculated by flexible matching between the observed patterns and a standard pattern. Discrimination is then carried out on the time‐series of scores obtained by tracking a target across video frames. We have conducted experiments with a prototype and confirmed the effectiveness of this method. © 1997 Scripta Technica, Inc. Electr Eng Jpn 119(1): 62–73, 1997
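The abstract outlines a four-stage pipeline: interframe differencing, projection-pattern extraction, similarity scoring against a standard human silhouette, and time-series discrimination. The sketch below illustrates one possible reading of that pipeline; all function names, thresholds, and the use of normalized correlation in place of the paper's flexible matching are assumptions for illustration, not details taken from the original publication.

```python
# Hypothetical sketch of the intruder-detection pipeline described in the abstract.
# Thresholds, bin counts, and the matching measure are illustrative assumptions.
import numpy as np

def changing_region(prev_frame, curr_frame, diff_thresh=25):
    """Interframe differential technique: binary mask of pixels that changed."""
    diff = np.abs(curr_frame.astype(np.int16) - prev_frame.astype(np.int16))
    return (diff > diff_thresh).astype(np.uint8)

def projection_pattern(mask, bins=32):
    """Collapse the changing region into a fixed-length, normalized projection
    profile (row-wise sums of the bounding box, resampled to `bins` values)."""
    ys, xs = np.nonzero(mask)
    if ys.size == 0:
        return np.zeros(bins)
    sub = mask[ys.min():ys.max() + 1, xs.min():xs.max() + 1]
    profile = sub.sum(axis=1).astype(float)      # horizontal projection per row
    resampled = np.interp(np.linspace(0, len(profile) - 1, bins),
                          np.arange(len(profile)), profile)
    return resampled / (resampled.max() + 1e-9)

def similarity_score(observed, standard):
    """Stand-in for the paper's flexible matching: normalized correlation
    between the observed pattern and the standard human-silhouette pattern."""
    o = observed - observed.mean()
    s = standard - standard.mean()
    denom = np.linalg.norm(o) * np.linalg.norm(s) + 1e-9
    return float(np.clip(o @ s / denom, 0.0, 1.0))

def is_intruder(score_history, window=10, score_thresh=0.7):
    """Time-series discrimination: flag a tracked target only after its
    similarity score stays high over several consecutive frames."""
    recent = score_history[-window:]
    return len(recent) == window and float(np.mean(recent)) > score_thresh
```

Scoring a whole time series per tracked target, rather than classifying each frame independently, is what the abstract credits with rejecting transient disturbances such as swaying vegetation or brightness changes, since those rarely sustain a human-like projection profile over many frames.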
