DePatch: Towards Robust Adversarial Patch for Evading Person Detectors in the Real World
Saved in:
Main Authors: | , , , , |
---|---|
Format: | Article |
Language: | English |
Subjects: | |
Online Access: | Order full text |
Summary: | Recent years have seen increasing interest in physical adversarial
attacks, which aim to craft deployable patterns for deceiving deep neural
networks, especially person detectors. However, the adversarial patterns of
existing patch-based attacks suffer heavily from the self-coupling issue, where
a degradation, caused by physical transformations, in any small patch segment
can result in complete adversarial dysfunction, leading to poor robustness in
the complex real world. Motivated by this observation, we introduce the
Decoupled adversarial Patch (DePatch) attack to address the self-coupling issue
of adversarial patches. Specifically, we divide the adversarial patch into
block-wise segments, and reduce the inter-dependency among these segments by
randomly erasing some segments during optimization. We further introduce a
border shifting operation and a progressive decoupling strategy to improve the
overall attack capability. Extensive experiments demonstrate the superior
performance of our method over other physical adversarial attacks, especially
in the real world. |
DOI: | 10.48550/arxiv.2408.06625 |
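The block-wise decoupling step summarized in the abstract can be sketched roughly as follows. This is not the authors' code: the grid size, erase ratio, and fill value are illustrative assumptions, and the sketch only shows the random segment-erasing operation applied to a patch during optimization.

```python
# Minimal sketch (assumed implementation, not the paper's code) of the
# block-wise random erasing idea: split a patch into a grid of segments
# and randomly erase a subset of them each optimization step, so that
# no single segment becomes indispensable to the adversarial effect.
import numpy as np

def random_block_erase(patch, grid=4, erase_ratio=0.25, fill=0.5, rng=None):
    """Erase a random subset of the grid x grid blocks of a square patch."""
    rng = rng or np.random.default_rng()
    out = patch.copy()
    h, w = patch.shape[:2]
    bh, bw = h // grid, w // grid
    n_blocks = grid * grid
    n_erase = int(n_blocks * erase_ratio)
    # pick which blocks to erase for this optimization step
    chosen = rng.choice(n_blocks, size=n_erase, replace=False)
    for idx in chosen:
        r, c = divmod(idx, grid)
        out[r * bh:(r + 1) * bh, c * bw:(c + 1) * bw] = fill
    return out

# Apply one erasing step to a random 64x64 RGB patch.
patch = np.random.default_rng(0).random((64, 64, 3))
erased = random_block_erase(patch, grid=4, erase_ratio=0.25,
                            rng=np.random.default_rng(1))
```

In the full attack, the erased patch would then be rendered onto training images and the surviving segments optimized against the detector's loss, which is what reduces inter-segment dependency.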