A light-weight quantum self-attention model for classical data classification
As an interdisciplinary field combining quantum computation and machine learning, Quantum Machine Learning (QML) has shown the potential to outperform classical machine learning on some algorithms. Given that the transformer, with self-attention as its core mechanism, has become a popular backbone model in machine learning, the exploration of a quantum version of the self-attention mechanism has become an intriguing topic. In this paper, we propose a Quantum Self-Attention Model (QSAM) based on Variational Quantum Algorithms (VQA), aiming to combine the advantages of quantum neural networks and self-attention. To implement the self-attention mechanism on a quantum neural network, we employ parameterized quantum circuits to learn the features of the input data in quantum-enhanced spaces, and then introduce the innovative Amplitude-Phase Decomposition Measurement (APDM) to obtain the essential components of the self-attention model: query, key and value. By introducing APDM, we can implement the quantum self-attention model with fewer parameters than previous methods, giving our QSAM better deployability on near-term quantum devices. We apply QSAM to both NLP and CV datasets for binary and multi-class classification. The results show that our QSAM outperforms its classical counterpart and matches the state-of-the-art quantum self-attention model on NLP datasets. On CV datasets, our QSAM achieves better performance than other quantum image classifiers. These results demonstrate the powerful learning ability of our QSAM.
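The abstract describes the mechanism only at a high level: parameterized quantum circuits learn per-token features, and a measurement scheme (APDM) yields the query, key and value components of self-attention. The paper's APDM construction is not reproduced here; the sketch below is a minimal, hypothetical illustration of the general idea, assuming the PennyLane library, with Pauli-Z/X/Y expectation values standing in for query, key and value and an ordinary scaled dot-product attention step applied classically afterwards.

```python
# Minimal, hypothetical sketch of a quantum self-attention step (NOT the
# paper's APDM circuit): each token is angle-encoded into a small
# parameterized circuit, and Pauli-Z/X/Y expectation values stand in for
# query/key/value features, which then feed a classical attention step.
# Assumes the PennyLane library (`pip install pennylane`).
import numpy as np
import pennylane as qml

n_qubits = 4  # toy per-token feature size (assumption, not from the paper)
dev = qml.device("default.qubit", wires=n_qubits)

@qml.qnode(dev)
def token_features(x, weights):
    # Data encoding: one RY rotation per classical feature.
    for i in range(n_qubits):
        qml.RY(x[i], wires=i)
    # One trainable layer with entanglement (one of many possible ansaetze).
    for i in range(n_qubits):
        qml.RZ(weights[i, 0], wires=i)
        qml.RY(weights[i, 1], wires=i)
    for i in range(n_qubits - 1):
        qml.CNOT(wires=[i, i + 1])
    # Three observable families -> three feature vectors per token.
    return ([qml.expval(qml.PauliZ(i)) for i in range(n_qubits)]
            + [qml.expval(qml.PauliX(i)) for i in range(n_qubits)]
            + [qml.expval(qml.PauliY(i)) for i in range(n_qubits)])

def quantum_self_attention(tokens, weights):
    """Scaled dot-product attention over quantum-derived q/k/v features."""
    feats = np.array([np.asarray(token_features(t, weights), dtype=float)
                      for t in tokens])
    q, k, v = (feats[:, :n_qubits],
               feats[:, n_qubits:2 * n_qubits],
               feats[:, 2 * n_qubits:])
    scores = q @ k.T / np.sqrt(n_qubits)        # attention logits
    attn = np.exp(scores)
    attn /= attn.sum(axis=1, keepdims=True)     # row-wise softmax
    return attn @ v                             # attended value vectors

# Toy usage: 3 tokens with 4 features each, random trainable parameters.
rng = np.random.default_rng(0)
tokens = rng.uniform(0.0, np.pi, size=(3, n_qubits))
weights = rng.uniform(0.0, 2 * np.pi, size=(n_qubits, 2))
print(quantum_self_attention(tokens, weights).shape)  # -> (3, 4)
```

The qubit count, ansatz and observables above are placeholders; in particular, the paper's APDM measurement and its reduced parameter count are not reproduced by this sketch.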
Saved in:
Published in: | Applied intelligence (Dordrecht, Netherlands), 2024-02, Vol.54 (4), p.3077-3091 |
---|---|
Main authors: | Zhang, Hui; Zhao, Qinglin; Chen, Chuangtao |
Format: | Article |
Language: | English |
Subjects: | Algorithms; Artificial Intelligence; Classification; Computer Science; Datasets; Machine learning; Machines; Manufacturing; Mechanical Engineering; Neural networks; Phase decomposition; Processes; Quantum computing; Weight reduction |
Online access: | Full text |
container_end_page | 3091 |
---|---|
container_issue | 4 |
container_start_page | 3077 |
container_title | Applied intelligence (Dordrecht, Netherlands) |
container_volume | 54 |
creator | Zhang, Hui; Zhao, Qinglin; Chen, Chuangtao |
description | As an interdisciplinary field combining quantum computation and machine learning, Quantum Machine Learning (QML) has shown the potential to outperform classical machine learning on some algorithms. Given that the transformer, with self-attention as its core mechanism, has become a popular backbone model in machine learning, the exploration of a quantum version of the self-attention mechanism has become an intriguing topic. In this paper, we propose a Quantum Self-Attention Model (QSAM) based on Variational Quantum Algorithms (VQA), aiming to combine the advantages of quantum neural networks and self-attention. To implement the self-attention mechanism on a quantum neural network, we employ parameterized quantum circuits to learn the features of the input data in quantum-enhanced spaces, and then introduce the innovative Amplitude-Phase Decomposition Measurement (APDM) to obtain the essential components of the self-attention model: query, key and value. By introducing APDM, we can implement the quantum self-attention model with fewer parameters than previous methods, giving our QSAM better deployability on near-term quantum devices. We apply QSAM to both NLP and CV datasets for binary and multi-class classification. The results show that our QSAM outperforms its classical counterpart and matches the state-of-the-art quantum self-attention model on NLP datasets. On CV datasets, our QSAM achieves better performance than other quantum image classifiers. These results demonstrate the powerful learning ability of our QSAM. |
doi_str_mv | 10.1007/s10489-024-05337-w |
format | Article |
fulltext | fulltext |
identifier | ISSN: 0924-669X |
ispartof | Applied intelligence (Dordrecht, Netherlands), 2024-02, Vol.54 (4), p.3077-3091 |
issn | 0924-669X; 1573-7497 |
language | eng |
recordid | cdi_proquest_journals_3031429875 |
source | SpringerNature Journals |
subjects | Algorithms; Artificial Intelligence; Classification; Computer Science; Datasets; Machine learning; Machines; Manufacturing; Mechanical Engineering; Neural networks; Phase decomposition; Processes; Quantum computing; Weight reduction |
title | A light-weight quantum self-attention model for classical data classification |