Using double attention for text tattoo localisation
Author(s) - Xu Xingpeng, Prasad Shitala, Cheng Kuanhong, Kong Adams Wai Kin
Publication year - 2022
Publication title -
IET Biometrics
Language(s) - English
Resource type - Journals
SCImago Journal Rank - 0.434
H-Index - 28
eISSN - 2047-4946
pISSN - 2047-4938
DOI - 10.1049/bme2.12071
Subject(s) - computer science, object (grammar), artificial intelligence, neglect, text detection, detector, mechanism (biology), text recognition, information retrieval, natural language processing, image (mathematics), psychology, telecommunications, psychiatry, philosophy, epistemology
Text tattoos contain rich information about an individual for forensic investigation. To extract this information, text tattoo localisation is the first and essential step. Previous tattoo studies applied existing object detectors to detect general tattoos, but none considered text tattoo localisation, and they neglected the prior knowledge that text tattoos usually appear inside or near larger tattoos and only on human skin. To exploit this prior knowledge, a prior knowledge-based attention mechanism (PKAM) and a network named Text Tattoo Localisation Network based on Double Attention (TTLN-DA) are proposed. In addition to TTLN-DA, two variants are designed to study the effectiveness of different prior knowledge. For this study, NTU Tattoo V2, the largest tattoo dataset, and NTU Text Tattoo V1, the largest text tattoo dataset, are established. To examine the importance of the prior knowledge and the effectiveness of the proposed attention mechanism and networks, TTLN-DA and its variants are compared with state-of-the-art object detectors and text detectors. The experimental results indicate that the prior knowledge is vital for text tattoo localisation, that PKAM contributes significantly to the performance, and that TTLN-DA outperforms state-of-the-art object detectors and scene text detectors.
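The abstract does not specify the PKAM architecture, but its core idea — reweighting detector features by two prior-knowledge attention maps, one for human skin and one for the surrounding larger tattoo — can be illustrated with a minimal sketch. The function name, the per-pixel maps, and the simple multiplicative fusion below are illustrative assumptions, not the paper's actual mechanism.

```python
import numpy as np

def double_attention(features, skin_att, tattoo_att):
    """Reweight a feature map by two prior-knowledge attention maps.

    features   : (H, W, C) detector feature map
    skin_att   : (H, W) map in [0, 1], high where human skin is likely
    tattoo_att : (H, W) map in [0, 1], high inside/near larger tattoos
    """
    # Fuse the two priors: a location matters only if it is on skin
    # AND inside/near a larger tattoo (hypothetical fusion rule).
    combined = skin_att * tattoo_att
    # Broadcast the fused map over all channels of the feature map.
    return features * combined[..., None]

# Toy example: a 2x2 feature map with 3 channels.
feat = np.ones((2, 2, 3))
skin = np.array([[1.0, 0.0],
                 [0.5, 1.0]])
tattoo = np.array([[1.0, 1.0],
                   [1.0, 0.0]])
out = double_attention(feat, skin, tattoo)
# Only the locations supported by both priors keep their activations.
```

Under this sketch, a pixel off-skin or outside any larger tattoo is suppressed to zero, which is one plausible way the stated prior knowledge could guide a text tattoo detector.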
