A light-weight quantum self-attention model for classical data classification
Published in: | Applied Intelligence (Dordrecht, Netherlands), 2024-02, Vol. 54 (4), p. 3077-3091 |
Main authors: | , , |
Format: | Article |
Language: | eng |
Subjects: | |
Online access: | Full text |
Abstract: | As an interdisciplinary field combining quantum computation and machine learning, Quantum Machine Learning (QML) has shown the potential to outperform classical machine learning on some algorithms. Given that the transformer, with self-attention as its core mechanism, has become a popular backbone model in the field of machine learning, the exploration of a quantum version of the self-attention mechanism has become an intriguing topic. In this paper, we propose a Quantum Self-Attention Model (QSAM) based on Variational Quantum Algorithms (VQA), aiming to combine the advantages of quantum neural networks and self-attention. To implement the self-attention mechanism on a quantum neural network, we employ parameterized quantum circuits to learn the features of the input data in quantum-enhanced spaces, then introduce the innovative Amplitude-Phase Decomposition Measurement (APDM) to obtain the essential components of the self-attention model: *query*, *key* and *value*. By introducing APDM, we can implement the quantum self-attention model with fewer parameters than previous methods, making our QSAM more deployable on near-term quantum devices. We apply QSAM to both NLP and CV datasets for binary and multi-class classification. The results show that our QSAM outperforms its classical counterpart and matches the state-of-the-art quantum self-attention model on NLP datasets. On CV datasets, our QSAM achieves better performance than other quantum image classifiers. These results demonstrate the powerful learning ability of our QSAM. |
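The abstract describes APDM only at a high level. As a rough illustration of the general idea (not the authors' published method), the sketch below uses parameterized circuits whose Pauli-X/Y expectation values are decomposed into an amplitude and a phase, which then stand in for the query, key and value projections of classical self-attention. The use of PennyLane, the `StronglyEntanglingLayers` ansatz, the circuit sizes, and the assignment of amplitude vs. phase outputs to q/k/v are all assumptions for illustration.

```python
# Minimal sketch of a quantum self-attention layer with an amplitude/phase
# readout. Assumptions: PennyLane, StronglyEntanglingLayers ansatz, and the
# (illustrative) choice of amplitudes as q/k and phases as v.
import numpy as np
import pennylane as qml

n_qubits = 4
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def circuit(weights, x):
    # Encode one token's feature vector, then apply a trainable ansatz.
    qml.AngleEmbedding(x, wires=range(n_qubits))
    qml.StronglyEntanglingLayers(weights, wires=range(n_qubits))
    # Measure X and Y on every qubit; each (X, Y) pair defines one
    # amplitude/phase decomposition.
    return [qml.expval(qml.PauliX(w)) for w in range(n_qubits)] + \
           [qml.expval(qml.PauliY(w)) for w in range(n_qubits)]

def amplitude_phase(weights, x):
    out = np.array(circuit(weights, x), dtype=float)
    ex, ey = out[:n_qubits], out[n_qubits:]
    return np.sqrt(ex**2 + ey**2), np.arctan2(ey, ex)  # amplitude, phase

def quantum_self_attention(tokens, wq, wk, wv):
    # One circuit per projection; amplitude parts serve as query/key,
    # phase parts as value (an illustrative assignment).
    q = np.stack([amplitude_phase(wq, t)[0] for t in tokens])
    k = np.stack([amplitude_phase(wk, t)[0] for t in tokens])
    v = np.stack([amplitude_phase(wv, t)[1] for t in tokens])
    scores = q @ k.T / np.sqrt(n_qubits)
    scores -= scores.max(axis=1, keepdims=True)   # numerical stability
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)       # row-wise softmax
    return attn @ v

# Usage: 5 tokens, one rotation angle per qubit, randomly initialized weights.
shape = qml.StronglyEntanglingLayers.shape(n_layers=2, n_wires=n_qubits)
rng = np.random.default_rng(0)
wq, wk, wv = (rng.normal(size=shape) for _ in range(3))
tokens = rng.uniform(0, np.pi, size=(5, n_qubits))
print(quantum_self_attention(tokens, wq, wk, wv).shape)  # (5, 4)
```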
ISSN: | 0924-669X (print); 1573-7497 (electronic) |
DOI: | 10.1007/s10489-024-05337-w |