Quantum Recurrent Neural Networks for Sequential Learning

The quantum neural network (QNN) is one of the promising directions in which near-term noisy intermediate-scale quantum (NISQ) devices could find advantageous applications over classical resources. Recurrent neural networks are the most fundamental networks for sequential learning, but to date a canonical model of the quantum recurrent neural network (QRNN) is still lacking, which restricts research in the field of quantum deep learning. In the present work, we propose a new kind of QRNN that is a good candidate for the canonical QRNN model: its quantum recurrent blocks (QRBs) are constructed in a hardware-efficient way, and the QRNN is built by stacking the QRBs in a staggered fashion that greatly reduces the algorithm's demands on the coherence time of quantum devices. That is, our QRNN is much more accessible on NISQ devices. Furthermore, the performance of the QRNN model is verified concretely on three different kinds of classical sequential data, i.e., meteorological indicators, stock prices, and text categorization. The numerical experiments show that our QRNN achieves much better prediction (classification) accuracy than the classical RNN and state-of-the-art QNN models for sequential learning, and can predict the fine-grained changes of temporal sequence data. The practical circuit structure and superior performance indicate that the present QRNN is a promising learning model for finding quantum-advantageous applications in the near term.
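The general idea of a quantum recurrent block, an angle-encoded input qubit entangled with a hidden qubit that carries memory across time steps, can be illustrated with a tiny statevector simulation. The sketch below is a generic two-qubit toy in plain NumPy, not the authors' actual circuit; the specific gate choices (RY encoding, one trainable RY layer, a single CNOT) and the parameter values are illustrative assumptions only.

```python
import numpy as np

I2 = np.eye(2, dtype=complex)
Z = np.diag([1.0, -1.0]).astype(complex)
# CNOT with qubit 0 (first kron factor, most significant) as control.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=complex)

def ry(theta):
    """Single-qubit Y-rotation gate."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]], dtype=complex)

def qrb(state, x_t, params):
    """One toy quantum recurrent block on 2 qubits:
    qubit 0 encodes the input, qubit 1 carries the hidden state."""
    t0, t1 = params
    state = np.kron(ry(x_t), I2) @ state     # angle-encode input x_t on qubit 0
    state = np.kron(ry(t0), ry(t1)) @ state  # trainable single-qubit layer
    state = CNOT @ state                     # entangle input and hidden qubits
    return state

def run_qrnn(xs, params):
    """Apply the block once per time step; read out <Z> on qubit 0."""
    state = np.zeros(4, dtype=complex)
    state[0] = 1.0  # start in |00>
    outputs = []
    for x in xs:
        state = qrb(state, x, params)
        z0 = np.kron(Z, I2)
        outputs.append(float(np.real(state.conj() @ z0 @ state)))
    return outputs

# Three bounded expectation values, one per time step.
outs = run_qrnn([0.1, 0.5, 0.9], params=(0.3, 0.7))
```

In the paper's design the blocks are additionally stacked in a staggered way across qubits so that each physical qubit stays coherent only for a few blocks; that layout detail is omitted in this single-register toy.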


Bibliographic Details
Main authors: Li, Yanan; Wang, Zhimin; Han, Rongbing; Shi, Shangshang; Li, Jiaxin; Shang, Ruimin; Zheng, Haiyong; Zhong, Guoqiang; Gu, Yongjian
Format: Article
Language: English
Online access: Order full text
DOI: 10.48550/arxiv.2302.03244
Source: arXiv.org
Subjects: Computer Science - Learning; Physics - Quantum Physics