Anomaly detection with variational quantum generative adversarial networks



Bibliographic Details
Published in: arXiv.org 2021-07
Main Authors: Herr, Daniel; Obert, Benjamin; Rosenkranz, Matthias
Format: Article
Language: English
Subjects:
Online Access: Full text
Description
Summary: Generative adversarial networks (GANs) are a machine learning framework comprising a generative model for sampling from a target distribution and a discriminative model for evaluating the proximity of a sample to the target distribution. GANs exhibit strong performance in imaging and anomaly detection. However, they suffer from training instabilities, and sampling efficiency may be limited by the classical sampling procedure. We introduce variational quantum-classical Wasserstein GANs to address these issues and embed this model in a classical machine learning framework for anomaly detection. Classical Wasserstein GANs improve training stability by using a cost function better suited for gradient descent. Our model replaces the generator of Wasserstein GANs with a hybrid quantum-classical neural net and leaves the classical discriminative model unchanged. This way, high-dimensional classical data only enters the classical model and need not be prepared in a quantum circuit. We demonstrate the effectiveness of this method on a credit card fraud dataset. For this dataset our method shows performance on par with classical methods in terms of the \(F_1\) score. We analyze the influence of the circuit ansatz, layer width and depth, neural net architecture, parameter initialization strategy, and sampling noise on convergence and performance.
ISSN:2331-8422
DOI:10.48550/arxiv.2010.10492
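The summary above describes replacing the generator of a Wasserstein GAN with a variational quantum circuit while keeping the classical critic unchanged. A minimal NumPy sketch of that idea follows; it is not the authors' implementation, and the hardware-efficient RY/CNOT ansatz, the latent-noise encoding, and the linear critic are illustrative assumptions only:

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

def apply_single(state, gate, qubit, n):
    """Apply a 2x2 gate to one qubit of an n-qubit state vector."""
    t = state.reshape([2] * n)
    t = np.tensordot(gate, t, axes=([1], [qubit]))
    t = np.moveaxis(t, 0, qubit)
    return t.reshape(-1)

def apply_cnot(state, control, target, n):
    """Apply a CNOT by swapping target amplitudes where control = 1."""
    t = state.reshape([2] * n).copy()
    idx0 = [slice(None)] * n
    idx1 = [slice(None)] * n
    idx0[control], idx0[target] = 1, 0
    idx1[control], idx1[target] = 1, 1
    a, b = t[tuple(idx0)].copy(), t[tuple(idx1)].copy()
    t[tuple(idx0)], t[tuple(idx1)] = b, a
    return t.reshape(-1)

def quantum_generator(params, noise, n=3):
    """Variational generator sketch: layers of RY rotations and a ring of
    CNOTs; latent noise is added to the first rotation layer. Returns the
    per-qubit Pauli-Z expectations as generated features."""
    state = np.zeros(2 ** n)
    state[0] = 1.0
    angles = params.copy()
    angles[0] = angles[0] + noise  # encode latent input (assumption)
    for layer in angles:
        for q in range(n):
            state = apply_single(state, ry(layer[q]), q, n)
        for q in range(n):
            state = apply_cnot(state, q, (q + 1) % n, n)
    probs = np.abs(state) ** 2
    z = np.zeros(n)
    for q in range(n):
        marg = probs.reshape([2] * n).sum(
            axis=tuple(i for i in range(n) if i != q))
        z[q] = marg[0] - marg[1]  # <Z_q> = P(0) - P(1)
    return z

def wasserstein_critic_loss(critic_w, real, fake):
    """Wasserstein critic objective E[f(real)] - E[f(fake)], here with a
    purely classical linear critic f(x) = x @ w (illustrative)."""
    return np.mean(real @ critic_w) - np.mean(fake @ critic_w)
```

The high-dimensional training data only ever enters `wasserstein_critic_loss`, the classical side; the quantum circuit produces the generator samples, matching the split described in the summary.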