Low‐rank structured sparse representation and reduced dictionary learning‐based abnormity detection
Author(s) - Xie Wenbin, Yin Hong, Wang Meini, Shao Yan, Yu Bosi
Publication year - 2019
Publication title - IET Computer Vision
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.38
H-Index - 37
eISSN - 1751-9640
pISSN - 1751-9632
DOI - 10.1049/iet-cvi.2018.5256
Subject(s) - sparse approximation, computer science, artificial intelligence, pattern recognition, rank (graph theory), feature extraction, representation, encoding, K-SVD, dictionary learning, sparse matrix, feature learning, mathematics
A novel abnormity detection method is presented which combines low‐rank structured sparse representation with reduced dictionary learning. The multi‐scale three‐dimensional gradient is used as the low‐level feature, encoding the spatiotemporal information of the video. A group of reduced sparse dictionaries is learnt by low‐rank approximation, exploiting the structured sparsity of the video sequence. The contribution of this study is three‐fold: (i) the normal feature clusters can be represented effectively by the reduced dictionaries, which are learnt based on the low‐rank nature of the data; (ii) the dictionary size is determined adaptively by the sparse learning method according to the scene, which makes the representation more compact and efficient; and (iii) the proposed method has low time complexity, enabling real‐time detection. The authors evaluate the proposed method against state‐of‐the‐art methods on public datasets and achieve very promising results.
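To make the pipeline in the abstract concrete, below is a minimal sketch of dictionary-based abnormality detection: a compact dictionary is learnt from normal data, and test samples are scored by their sparse-reconstruction error. This is not the authors' implementation; the multi-scale 3D gradient features, the low-rank structured-sparsity learning, and the adaptive dictionary sizing are stood in for by synthetic features and scikit-learn's standard DictionaryLearning/sparse_encode, and every size, parameter, and threshold here is an illustrative assumption.

```python
# Illustrative sketch only: reconstruction-error abnormality detection with a
# small ("reduced") sparse dictionary. The paper's 3D-gradient features and
# low-rank structured-sparsity learning are replaced by synthetic data and
# standard scikit-learn dictionary learning; all values are assumptions.

import numpy as np
from sklearn.decomposition import DictionaryLearning, sparse_encode

rng = np.random.default_rng(0)

# Stand-in for normal training features (e.g. multi-scale 3D gradient
# descriptors of video patches); shape: (n_samples, n_features).
X_train = rng.normal(size=(500, 64))

# Learn a compact dictionary from normal data. The paper sizes the dictionary
# adaptively via low-rank approximation; here the size is simply fixed small
# to mimic a "reduced" dictionary.
dico = DictionaryLearning(n_components=16, alpha=1.0, max_iter=100,
                          random_state=0)
dico.fit(X_train)
D = dico.components_  # (n_atoms, n_features)

def abnormality_score(X, D, alpha=1.0):
    """Sparse-code X over D and return per-sample reconstruction error.

    Normal samples should be well represented by a dictionary learnt from
    normal data, so a large residual signals a potential abnormality.
    """
    codes = sparse_encode(X, D, algorithm="lasso_lars", alpha=alpha)
    residual = X - codes @ D
    return np.linalg.norm(residual, axis=1)

# Calibrate a threshold on the normal training data (3-sigma rule here,
# purely as an example).
train_scores = abnormality_score(X_train, D)
threshold = train_scores.mean() + 3 * train_scores.std()

# Score a mix of normal-like and shifted (abnormal-like) test samples and
# flag those whose reconstruction error exceeds the threshold.
X_test = np.vstack([rng.normal(size=(10, 64)),
                    rng.normal(loc=3.0, size=(10, 64))])
flags = abnormality_score(X_test, D) > threshold
print(flags.astype(int))
```

The design mirrors the paper's premise: normal events admit a compact sparse representation over a small dictionary learnt from normal footage, so poorly reconstructed samples are candidate abnormalities; keeping the dictionary small is also what keeps the per-sample encoding cheap enough for real-time use.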
