Open Access
Generation of cancellable locality sampled codes from facial images
Author(s) - Sadhya Debanjan, Utsav Utkarsh, Akhtar Zahid
Publication year - 2021
Publication title - IET Biometrics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.434
H-Index - 28
eISSN - 2047-4946
pISSN - 2047-4938
DOI - 10.1049/bme2.12016
Subject(s) - computer science, locality, artificial intelligence, pattern recognition (psychology), computer vision, computer graphics (images), speech recognition, philosophy, linguistics
Face is arguably the most common biometric trait that has been extensively utilised and thoroughly studied. Since this unimodal feature visually represents the identity of an individual, preserving the security of face‐based authentication models is a prime concern. This work proposes a framework for generating cancellable templates from raw facial images. Our scheme is essentially based upon the notion of locality sensitive hashing (LSH), specifically on its locality sampled code (LSC) realisation. Facial features are initially extracted using the binarized statistical image features (BSIF) descriptor. These binary features are subsequently hashed using the random bit sampling mechanism of LSC. Finally, these local hashes are permanently stored in a non‐invertible manner. We have empirically analysed the security requirements of unlinkability, non‐invertibility, and revocability in our model. We have also validated our work over the benchmark AR, ORL, Yale, and CASIA‐Facev5 databases under multiple scenarios. Across all the evaluated cases, the best performance of our model was observed at minimum EERs of 2.69%, 4.45%, 1.2%, and 2.66% for the four data sets, respectively.
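The pipeline described in the abstract (BSIF binarisation, random bit sampling via LSC, and non-invertible storage of the resulting local hashes) can be illustrated with a short sketch. The Python snippet below is a minimal, hypothetical illustration rather than the authors' implementation: the function name, the use of SHA-256 for one-way storage, and the parameters num_hashes and bits_per_hash are assumptions made for demonstration, and the BSIF extraction step is replaced by a random binary vector.

```python
import hashlib
import numpy as np

def locality_sampled_codes(binary_features, key, num_hashes=32, bits_per_hash=16):
    """Illustrative LSC-style template generation (not the paper's exact scheme).

    binary_features : 1-D array of 0/1 values (stand-in for binarised BSIF responses).
    key             : user/application-specific seed; reissuing a new key produces a
                      fresh, revocable template.
    num_hashes, bits_per_hash : illustrative parameters, not values from the paper.
    """
    rng = np.random.default_rng(key)
    templates = []
    for _ in range(num_hashes):
        # Randomly sample bit positions from the binary feature vector
        # (the "locality sampling" step of the LSH realisation).
        idx = rng.choice(binary_features.size, size=bits_per_hash, replace=False)
        sampled_bits = binary_features[idx]
        # Store only a one-way hash of the sampled bits, so the stored
        # template cannot be inverted back to the original features.
        digest = hashlib.sha256(np.packbits(sampled_bits).tobytes()).hexdigest()
        templates.append(digest)
    return templates

# Example usage with a random stand-in for a binary BSIF feature vector.
features = np.random.randint(0, 2, size=4096, dtype=np.uint8)
template = locality_sampled_codes(features, key=42)
print(len(template), template[0][:16])
```

In this sketch, cancellability and revocability come from the seed: changing the key re-samples different bit positions and yields a new template that is unlinkable to the compromised one, while the one-way digest keeps the stored data non-invertible.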
