Towards Physically-Realizable Adversarial Attacks in Embodied Vision Navigation

The deployment of embodied navigation agents in safety-critical environments raises concerns about their vulnerability to adversarial attacks on deep neural networks. However, current attack methods often lack practicality due to challenges in transitioning from the digital to the physical world, while existing physical attacks for object detection fail to achieve both multi-view effectiveness and naturalness. To address this, we propose a practical attack method for embodied navigation by attaching adversarial patches with learnable textures and opacity to objects. Specifically, to ensure effectiveness across varying viewpoints, we employ a multi-view optimization strategy based on object-aware sampling, which uses feedback from the navigation model to optimize the patch's texture. To make the patch inconspicuous to human observers, we introduce a two-stage opacity optimization mechanism, where opacity is refined after texture optimization. Experimental results show our adversarial patches reduce navigation success rates by about 40%, outperforming previous methods in practicality, effectiveness, and naturalness. Code is available at: https://github.com/chen37058/Physical-Attacks-in-Embodied-Navigation
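The multi-view texture optimization the abstract describes can be illustrated with a short sketch. The following is a minimal, hypothetical PyTorch rendition, not the authors' implementation: `render_view`, `sample_poses`, and `nav_loss` are assumed stand-ins for the simulator's renderer, the object-aware pose sampler, and a loss over the navigation model's output, and the loop simply performs gradient ascent on that loss averaged over sampled viewpoints.

```python
import torch

def optimize_patch_texture(nav_model, scene, render_view, sample_poses,
                           nav_loss, patch_hw=(64, 64), n_poses=8,
                           steps=500, lr=0.01):
    """Stage one: learn the patch texture across viewpoints (hypothetical API)."""
    # Unconstrained latent; sigmoid maps it to a valid RGB texture in [0, 1].
    latent = torch.zeros(3, *patch_hw, requires_grad=True)
    opt = torch.optim.Adam([latent], lr=lr)
    for _ in range(steps):
        texture = torch.sigmoid(latent)
        loss = 0.0
        # Object-aware sampling: poses are drawn around the target object so
        # the patch remains visible and effective as the agent moves.
        for pose in sample_poses(scene, n_poses):
            obs = render_view(scene, texture, pose)             # patched observation
            loss = loss - nav_loss(nav_model(obs), scene.goal)  # ascend nav loss
        opt.zero_grad()
        (loss / n_poses).backward()
        opt.step()
    return torch.sigmoid(latent).detach()
```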
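The two-stage opacity mechanism can be sketched the same way: with the stage-one texture frozen, a per-pixel opacity map is optimized so the attack stays effective while a regularizer pushes opacity down, letting the patch blend into the object. Again an assumed sketch, with `composite` a hypothetical alpha-blending renderer and `beta` a made-up trade-off weight.

```python
import torch

def optimize_opacity(texture, nav_model, scene, composite, sample_poses,
                     nav_loss, steps=200, lr=0.01, beta=0.1, n_poses=4):
    """Stage two: refine per-pixel opacity with the stage-one texture frozen."""
    texture = texture.detach()  # texture is no longer optimized in this stage
    alpha_latent = torch.zeros(1, *texture.shape[1:], requires_grad=True)
    opt = torch.optim.Adam([alpha_latent], lr=lr)
    for _ in range(steps):
        alpha = torch.sigmoid(alpha_latent)  # per-pixel opacity in (0, 1)
        attack = 0.0
        for pose in sample_poses(scene, n_poses):
            # `composite` alpha-blends the patch over the object surface
            # before rendering the agent's observation (assumed helper).
            obs = composite(scene, texture, alpha, pose)
            attack = attack + nav_loss(nav_model(obs), scene.goal)
        # Keep the attack strong while penalizing mean opacity, so the patch
        # stays inconspicuous to human observers.
        loss = -attack / n_poses + beta * alpha.mean()
        opt.zero_grad()
        loss.backward()
        opt.step()
    return torch.sigmoid(alpha_latent).detach()
```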

Detailed Description

Saved in:
Bibliographic Details
Main Authors: Chen, Meng; Tu, Jiawei; Qi, Chao; Dang, Yonghao; Zhou, Feng; Wei, Wei; Yin, Jianqin
Format: Article
Language: English
Subjects: Computer Science - Computer Vision and Pattern Recognition; Computer Science - Robotics
Online Access: Order full text
creator Chen, Meng
Tu, Jiawei
Qi, Chao
Dang, Yonghao
Zhou, Feng
Wei, Wei
Yin, Jianqin
description The deployment of embodied navigation agents in safety-critical environments raises concerns about their vulnerability to adversarial attacks on deep neural networks. However, current attack methods often lack practicality due to challenges in transitioning from the digital to the physical world, while existing physical attacks for object detection fail to achieve both multi-view effectiveness and naturalness. To address this, we propose a practical attack method for embodied navigation by attaching adversarial patches with learnable textures and opacity to objects. Specifically, to ensure effectiveness across varying viewpoints, we employ a multi-view optimization strategy based on object-aware sampling, which uses feedback from the navigation model to optimize the patch's texture. To make the patch inconspicuous to human observers, we introduce a two-stage opacity optimization mechanism, where opacity is refined after texture optimization. Experimental results show our adversarial patches reduce navigation success rates by about 40%, outperforming previous methods in practicality, effectiveness, and naturalness. Code is available at: [https://github.com/chen37058/Physical-Attacks-in-Embodied-Navigation].
doi_str_mv 10.48550/arxiv.2409.10071
format Article
identifier DOI: 10.48550/arxiv.2409.10071
language eng
recordid cdi_arxiv_primary_2409_10071
source arXiv.org
subjects Computer Science - Computer Vision and Pattern Recognition
Computer Science - Robotics
title Towards Physically-Realizable Adversarial Attacks in Embodied Vision Navigation