Robust Shoeprint Retrieval Method Based on Local‐to‐Global Feature Matching for Real Crime Scenes
Author(s) - Cui Junjian, Zhao Xiaorui, Liu Nini, Morgachev Sergey, Li Daixi
Publication year - 2019
Publication title - Journal of Forensic Sciences
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.715
H-Index - 96
eISSN - 1556-4029
pISSN - 0022-1198
DOI - 10.1111/1556-4029.13894
Subject(s) - crime scene, matching (statistics), feature (linguistics), artificial intelligence, pattern recognition (psychology), computer science, feature matching, computer vision, poison control, feature extraction, criminology, medical emergency, medicine, mathematics, psychology, statistics, linguistics, philosophy
In this study, an automatic and robust crime scene shoeprint retrieval method is proposed. Because most shoeprints left at crime scenes are partial in unpredictable regions and degraded by noise, crime scene shoeprint retrieval is a challenging task. To handle partial, noisy shoeprint images, we employ a denoising deep belief network (DBN) to extract local features and use spatial pyramid matching (SPM) to obtain a local-to-global matching score. For evaluation, 536 query shoeprint images from real crime scenes and a large-scale database containing 34,768 shoeprint images are used to assess retrieval performance. Experimental results show that the proposed method outperforms other state-of-the-art methods in terms of retrieval accuracy, feature dimension, and retrieval speed. The proposed method achieves a cumulative match score (CMS) of 65.67% in the top 10, which is 5.60% higher than that of the second-best performing method.
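The abstract describes a two-stage pipeline: local features (from a denoising DBN in the paper) are compared cell by cell over a spatial pyramid, and the weighted cell similarities are accumulated into a single local-to-global matching score. The sketch below illustrates only that SPM scoring step on same-sized images. The local feature extractor here is a hypothetical placeholder (L2-normalized cell intensities) standing in for the paper's DBN, and the level weights follow a common SPM convention rather than the authors' exact scheme.

```python
# Minimal sketch of a local-to-global matching score via spatial pyramid
# matching (SPM). Assumptions: query and reference are aligned, same-sized
# grayscale arrays; extract_local_feature is a stand-in for the DBN features
# used in the paper; the level weighting is a generic SPM choice.
import numpy as np

def extract_local_feature(cell):
    # Placeholder local descriptor: flatten the cell and L2-normalize it.
    v = cell.astype(np.float64).ravel()
    n = np.linalg.norm(v)
    return v / n if n > 0 else v

def spm_score(query, reference, levels=3):
    """Accumulate weighted cell-wise similarities over a 1x1, 2x2, 4x4 pyramid."""
    assert query.shape == reference.shape
    h, w = query.shape
    score = 0.0
    for level in range(levels):
        grid = 2 ** level                     # number of cells per side
        weight = 2.0 ** (level - levels + 1)  # finer levels contribute more
        ys = np.linspace(0, h, grid + 1, dtype=int)
        xs = np.linspace(0, w, grid + 1, dtype=int)
        for i in range(grid):
            for j in range(grid):
                q = extract_local_feature(query[ys[i]:ys[i+1], xs[j]:xs[j+1]])
                r = extract_local_feature(reference[ys[i]:ys[i+1], xs[j]:xs[j+1]])
                score += weight * float(q @ r)  # cosine similarity of the cell
    return score

# Toy retrieval example: rank a small synthetic database by descending score.
rng = np.random.default_rng(0)
query = rng.random((128, 64))
database = [rng.random((128, 64)) for _ in range(5)]
ranking = sorted(range(len(database)),
                 key=lambda k: spm_score(query, database[k]), reverse=True)
print(ranking)
```

Because noisy or missing regions only affect the cells they fall in, this kind of cell-wise accumulation degrades gracefully for partial prints, which is the motivation the abstract gives for combining local features with a pyramid-level aggregation.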