Using double attention for text tattoo localisation
Published in: IET Biometrics, May 2022, Vol. 11, Issue 3, pp. 199-214
Format: Article
Language: English
Online access: Full text
Abstract: Text tattoos contain rich information about an individual for forensic investigation. To extract this information, text tattoo localisation is the first and essential step. Previous tattoo studies applied existing object detectors to detect general tattoos, but none of them considered text tattoo localisation, and they neglected the prior knowledge that text tattoos usually lie inside or near larger tattoos and appear only on human skin. To exploit this prior knowledge, a prior knowledge-based attention mechanism (PKAM) and a network named Text Tattoo Localisation Network based on Double Attention (TTLN-DA) are proposed. In addition to TTLN-DA, two variants of TTLN-DA are designed to study the effectiveness of different prior knowledge. For this study, NTU Tattoo V2, the largest tattoo dataset, and NTU Text Tattoo V1, the largest text tattoo dataset, are established. To examine the importance of the prior knowledge and the effectiveness of the proposed attention mechanism and networks, TTLN-DA and its variants are compared with state-of-the-art object detectors and text detectors. The experimental results indicate that the prior knowledge is vital for text tattoo localisation, that the PKAM contributes significantly to the performance, and that TTLN-DA outperforms the state-of-the-art object detectors and scene text detectors.
ISSN: 2047-4938, 2047-4946
DOI: 10.1049/bme2.12071
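The abstract does not give implementation details of the PKAM or the double-attention design. As an illustrative note only, the sketch below shows one plausible way prior-knowledge-based spatial attention over a skin probability map and a general-tattoo probability map could be wired up in PyTorch; the module names, input shapes, and additive fusion are assumptions made here for illustration, not the authors' published architecture.

```python
# Illustrative sketch only: not the authors' code. All names, shapes and the
# fusion strategy are assumptions based on the abstract's description of
# "prior knowledge" (skin regions, larger tattoos) and "double attention".
import torch
import torch.nn as nn
import torch.nn.functional as F


class PriorKnowledgeAttention(nn.Module):
    """Hypothetical prior-knowledge attention: a prior map (e.g. skin or tattoo
    probability) is converted into a spatial gate that reweights backbone features."""

    def __init__(self, in_channels: int):
        super().__init__()
        # 1x1 convs mix the prior with the features and produce a single-channel gate.
        self.mix = nn.Conv2d(in_channels + 1, in_channels, kernel_size=1)
        self.attn = nn.Conv2d(in_channels, 1, kernel_size=1)

    def forward(self, feats: torch.Tensor, prior: torch.Tensor) -> torch.Tensor:
        # Resize the prior map (B, 1, h, w) to the feature-map resolution.
        prior = F.interpolate(prior, size=feats.shape[-2:], mode="bilinear",
                              align_corners=False)
        mixed = F.relu(self.mix(torch.cat([feats, prior], dim=1)))
        gate = torch.sigmoid(self.attn(mixed))   # (B, 1, H, W), values in [0, 1]
        return feats * gate                       # suppress regions outside the prior


class DoubleAttentionHead(nn.Module):
    """Hypothetical 'double attention': one branch driven by a skin prior and one by
    a general-tattoo prior, fused by summation before a detection head (not shown)."""

    def __init__(self, in_channels: int):
        super().__init__()
        self.skin_attn = PriorKnowledgeAttention(in_channels)
        self.tattoo_attn = PriorKnowledgeAttention(in_channels)

    def forward(self, feats, skin_prior, tattoo_prior):
        return self.skin_attn(feats, skin_prior) + self.tattoo_attn(feats, tattoo_prior)


if __name__ == "__main__":
    feats = torch.randn(2, 256, 32, 32)    # backbone features (assumed shape)
    skin = torch.rand(2, 1, 128, 128)      # skin probability map (assumed input)
    tattoo = torch.rand(2, 1, 128, 128)    # general-tattoo probability map (assumed input)
    out = DoubleAttentionHead(256)(feats, skin, tattoo)
    print(out.shape)                       # torch.Size([2, 256, 32, 32])
```

In this sketch the sigmoid gates simply down-weight features outside skin or tattoo regions before localisation, which is one straightforward reading of how the prior knowledge described in the abstract could be injected; the actual PKAM and TTLN-DA may differ substantially.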