Evolutionary Art Attack For Black-Box Adversarial Example Generation

Bibliographic Details
Published in: IEEE Transactions on Evolutionary Computation, 2024, p. 1-1
Main authors: Williams, Phoenix Neale; Li, Ke; Min, Geyong
Format: Article
Language: English
Description
Summary: Deep neural networks (DNNs) have achieved remarkable performance in various tasks, including image classification. However, recent research has revealed the susceptibility of trained DNNs to subtle perturbations introduced into input images. Addressing these vulnerabilities is pivotal, leading to a significant area of study focused on developing attack algorithms capable of generating potent adversarial images. In scenarios where access to gradient information is restricted (the black-box scenario), many existing methods introduce optimized perturbations to each individual pixel of an image to cause trained DNNs to misclassify. However, due to the high-dimensional nature of this approach, current methods have inherent limitations. In contrast, our proposed approach constructs perturbations by concatenating a series of overlapping semi-transparent shapes. Through the optimization of these shapes' characteristics, we generate perturbations that result in the desired misclassification by the DNN. By conducting a series of attacks on state-of-the-art DNNs trained on the CIFAR-10 and ImageNet datasets, we show that our method consistently outperforms existing attack algorithms in terms of both query efficiency and success rate.
ISSN: 1089-778X, 1941-0026
DOI: 10.1109/TEVC.2024.3391063
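
The abstract describes building perturbations from overlapping semi-transparent shapes whose characteristics are optimized through black-box queries. The Python sketch below illustrates that general idea only; it is not the paper's algorithm. It uses a simple (1+1)-style evolutionary loop over semi-transparent circles, and `query_model` (a function assumed to return a class-probability vector for a uint8 image) as well as all parameter choices are illustrative assumptions.

# Illustrative sketch only (assumptions, not the paper's exact algorithm):
# evolve the parameters of overlapping semi-transparent circles so that the
# composited image is misclassified by a black-box classifier. `query_model`
# is a hypothetical stand-in for the attacked DNN's prediction interface.
import numpy as np

rng = np.random.default_rng(0)

def render(image, shapes):
    # Alpha-composite circles given as rows (cx, cy, radius, r, g, b, alpha),
    # with all parameters normalized to [0, 1].
    out = image.astype(np.float32).copy()
    h, w, _ = out.shape
    yy, xx = np.mgrid[0:h, 0:w]
    for cx, cy, rad, r, g, b, alpha in shapes:
        mask = (xx - cx * w) ** 2 + (yy - cy * h) ** 2 <= (rad * 0.2 * min(h, w)) ** 2
        color = np.array([r, g, b], dtype=np.float32) * 255.0
        out[mask] = (1.0 - alpha) * out[mask] + alpha * color
    return np.clip(out, 0, 255).astype(np.uint8)

def attack(image, true_label, query_model, n_shapes=10, iters=1000, sigma=0.05):
    # (1+1)-style evolutionary loop: mutate all shape parameters with Gaussian
    # noise and keep the candidate that lowers confidence in the true class.
    shapes = rng.uniform(0.0, 1.0, size=(n_shapes, 7))
    shapes[:, 6] *= 0.3  # start with low alpha so the shapes begin semi-transparent
    best = query_model(render(image, shapes))[true_label]
    for _ in range(iters):
        cand = np.clip(shapes + sigma * rng.standard_normal(shapes.shape), 0.0, 1.0)
        adv = render(image, cand)
        probs = query_model(adv)
        if probs.argmax() != true_label:
            return adv  # success: the black-box model misclassifies the image
        if probs[true_label] < best:
            shapes, best = cand, probs[true_label]
    return render(image, shapes)  # best effort if the query budget runs out

In a real black-box setting, `query_model` would wrap calls to the target classifier and the loop would additionally track the number of queries spent against a fixed budget, which is the efficiency measure the abstract refers to.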