Towards Physically-Realizable Adversarial Attacks in Embodied Vision Navigation
Format: Article
Language: English
Abstract: The deployment of embodied navigation agents in safety-critical environments raises concerns about their vulnerability to adversarial attacks on deep neural networks. However, current attack methods often lack practicality due to challenges in transitioning from the digital to the physical world, while existing physical attacks for object detection fail to achieve both multi-view effectiveness and naturalness. To address this, we propose a practical attack method for embodied navigation that attaches adversarial patches with learnable textures and opacity to objects. Specifically, to ensure effectiveness across varying viewpoints, we employ a multi-view optimization strategy based on object-aware sampling, which uses feedback from the navigation model to optimize the patch's texture. To make the patch inconspicuous to human observers, we introduce a two-stage opacity optimization mechanism in which opacity is refined after texture optimization. Experimental results show our adversarial patches reduce navigation success rates by about 40%, outperforming previous methods in practicality, effectiveness, and naturalness. Code is available at: https://github.com/chen37058/Physical-Attacks-in-Embodied-Navigation
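The abstract describes two optimization stages: texture first, then opacity. The sketch below illustrates that pipeline in PyTorch under stated assumptions; it is not the authors' released implementation (see the repository above). `NavModel`, `sample_views`, `composite`, and all hyperparameters are illustrative placeholders.

```python
# Minimal two-stage patch-optimization sketch. Everything here is a
# hypothetical stand-in for the paper's method, not the released code.
import torch
import torch.nn as nn
import torch.nn.functional as F

class NavModel(nn.Module):
    """Stand-in for the victim navigation policy (assumption)."""
    def __init__(self, num_actions=4):
        super().__init__()
        self.backbone = nn.Sequential(
            nn.Conv2d(3, 16, 3, stride=2, padding=1), nn.ReLU(),
            nn.Conv2d(16, 32, 3, stride=2, padding=1), nn.ReLU(),
            nn.AdaptiveAvgPool2d(1), nn.Flatten(),
            nn.Linear(32, num_actions),
        )
    def forward(self, x):
        return self.backbone(x)

def composite(view, texture, opacity, mask):
    """Alpha-composite the patch onto a rendered view inside `mask`."""
    alpha = torch.sigmoid(opacity) * mask           # per-pixel opacity in [0, 1]
    return view * (1 - alpha) + torch.sigmoid(texture) * alpha

def sample_views(batch=8, size=64):
    """Placeholder for object-aware multi-view sampling (assumption):
    returns rendered views plus the patch's projected mask per view."""
    views = torch.rand(batch, 3, size, size)
    mask = torch.zeros(batch, 1, size, size)
    mask[:, :, 16:48, 16:48] = 1.0                  # fixed square patch region
    return views, mask

model = NavModel().eval()
for p in model.parameters():
    p.requires_grad_(False)

texture = torch.zeros(1, 3, 64, 64, requires_grad=True)
opacity = torch.full((1, 1, 64, 64), 2.0, requires_grad=True)  # start near-opaque

# Stage 1: optimize texture across sampled viewpoints, using the navigation
# model's output as feedback, to push the agent away from its clean action.
opt_tex = torch.optim.Adam([texture], lr=0.05)
for _ in range(100):
    views, mask = sample_views()
    with torch.no_grad():
        target = model(views).argmax(dim=1)         # action on clean views
    logits = model(composite(views, texture, opacity.detach(), mask))
    loss = -F.cross_entropy(logits, target)         # ascend away from it
    opt_tex.zero_grad(); loss.backward(); opt_tex.step()

# Stage 2: freeze the texture and refine opacity, trading attack strength
# against conspicuousness via a mean-opacity penalty.
opt_op = torch.optim.Adam([opacity], lr=0.05)
for _ in range(100):
    views, mask = sample_views()
    with torch.no_grad():
        target = model(views).argmax(dim=1)
    logits = model(composite(views, texture.detach(), opacity, mask))
    attack = -F.cross_entropy(logits, target)
    visibility = torch.sigmoid(opacity).mean()      # penalize visible patches
    loss = attack + 0.5 * visibility                # 0.5 is an arbitrary weight
    opt_op.zero_grad(); loss.backward(); opt_op.step()
```

The 0.5 visibility weight and the untargeted cross-entropy objective are placeholder design choices; the paper's actual losses, sampler, and rendering pipeline may differ.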
DOI: 10.48550/arxiv.2409.10071