Near-Term Advances in Quantum Natural Language Processing
This paper describes experiments showing that some tasks in natural language processing (NLP) can already be performed using quantum computers, though so far only with small datasets. We demonstrate various approaches to topic classification. The first uses an explicit word-based approach, in which...
Saved in:
Published in: | arXiv.org 2024-04 |
---|---|
Main authors: | Widdows, Dominic; Aaranya Alexander; Zhu, Daiwei; Zimmerman, Chase; Majumder, Arunava |
Format: | Article |
Language: | eng |
Keywords: | |
Online access: | Full text |
Tags: |
|
container_end_page | |
---|---|
container_issue | |
container_start_page | |
container_title | arXiv.org |
container_volume | |
creator | Widdows, Dominic; Aaranya Alexander; Zhu, Daiwei; Zimmerman, Chase; Majumder, Arunava |
description | This paper describes experiments showing that some tasks in natural language processing (NLP) can already be performed using quantum computers, though so far only with small datasets. We demonstrate various approaches to topic classification. The first uses an explicit word-based approach, in which word-topic scoring weights are implemented as fractional rotations of individual qubits, and a new phrase is classified based on the accumulation of these weights in a scoring qubit using entangling controlled-NOT gates. This is compared with more scalable quantum encodings of word embedding vectors, which are used to compute kernel values in a quantum support vector machine: this approach achieved an average of 62% accuracy on classification tasks involving over 10,000 words, the largest such quantum computing experiment to date. We describe a quantum probability approach to bigram modeling that can be applied to sequences of words and formal concepts, investigating a generative approximation to these distributions using a quantum circuit Born machine, and an approach to ambiguity resolution in verb-noun composition using single-qubit rotations for simple nouns and two-qubit controlled-NOT gates for simple verbs. The smaller systems described have been run successfully on physical quantum computers, and the larger ones have been simulated. We show that statistically meaningful results can be obtained using real datasets, though these are much harder to predict than with the simpler artificial-language examples used previously in developing quantum NLP systems. Other approaches to quantum NLP are compared, partly with respect to contemporary issues including informal language, fluency, and truthfulness. |
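The word-based scoring scheme in the description, with word-topic weights implemented as fractional rotations accumulated in a scoring qubit, can be illustrated with a minimal NumPy sketch. The weights, scaling factor, and phrases below are hypothetical, and successive RY rotations on a single scoring qubit stand in for the paper's controlled-gate accumulation:

```python
import numpy as np

# Hypothetical word-topic weights for one topic (illustrative only).
weights = {"quantum": 0.9, "qubit": 0.8, "banana": 0.05}

def phrase_score(phrase, weights, scale=np.pi / 4):
    """Score a phrase against a topic.

    Each word rotates a scoring qubit (starting in |0>) by an angle
    proportional to its topic weight; RY rotations on one qubit compose
    by adding angles, so the topic score is the probability of measuring
    |1>: sin^2(total_angle / 2).
    """
    total = sum(weights.get(w, 0.0) * scale for w in phrase.split())
    return np.sin(total / 2) ** 2

print(phrase_score("quantum qubit", weights))   # on-topic phrase
print(phrase_score("banana", weights))          # off-topic phrase
```

A phrase whose words carry large topic weights accumulates a larger rotation and hence a higher probability of the scoring qubit reading |1⟩; classification then amounts to comparing scores across topics.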
doi_str_mv | 10.48550/arxiv.2206.02171 |
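The kernel-based approach the description mentions, quantum encodings of word embedding vectors used to compute kernel values for a support vector machine, reduces to measuring state overlap. A classical sketch of that idea, with made-up toy embeddings, computes the fidelity |⟨x|y⟩|² between amplitude-encoded vectors (this simulates the kernel value a quantum device would estimate; it is not the authors' circuit):

```python
import numpy as np

def amplitude_encode(v):
    """Normalize a real vector so it is a valid quantum amplitude vector."""
    v = np.asarray(v, dtype=float)
    return v / np.linalg.norm(v)

def quantum_kernel(x, y):
    """Fidelity |<x|y>|^2 between the amplitude-encoded states of x and y."""
    return float(np.abs(amplitude_encode(x) @ amplitude_encode(y)) ** 2)

# Hypothetical 4-dimensional word embeddings (illustrative only).
cat = [0.9, 0.1, 0.3, 0.1]
dog = [0.8, 0.2, 0.4, 0.1]
car = [0.1, 0.9, 0.1, 0.6]

print(quantum_kernel(cat, dog))  # similar words: overlap near 1
print(quantum_kernel(cat, car))  # dissimilar words: smaller overlap
```

A kernel matrix of such fidelities can then be passed to an ordinary support vector machine (e.g. with a precomputed-kernel option), which is what makes this encoding more scalable than the explicit word-based circuits.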
format | Article |
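The quantum circuit Born machine mentioned in the description models a probability distribution as the measurement statistics of a parametrized circuit. A minimal two-qubit sketch (illustrative only, not the paper's circuit) builds a state with RY rotations and a CNOT; by the Born rule, the squared amplitudes give a distribution over four outcomes, which could stand for four candidate bigrams:

```python
import numpy as np

def ry(theta):
    """Single-qubit RY rotation matrix (real-valued)."""
    c, s = np.cos(theta / 2), np.sin(theta / 2)
    return np.array([[c, -s], [s, c]])

# CNOT gate: flips the second qubit when the first qubit is |1>.
CNOT = np.array([[1, 0, 0, 0],
                 [0, 1, 0, 0],
                 [0, 0, 0, 1],
                 [0, 0, 1, 0]], dtype=float)

def born_distribution(theta1, theta2):
    """Probabilities over the 4 basis states of a 2-qubit circuit:
    |00> -> RY(theta1) (x) RY(theta2) -> CNOT -> measure."""
    state = np.zeros(4)
    state[0] = 1.0                                  # start in |00>
    state = np.kron(ry(theta1), ry(theta2)) @ state # independent rotations
    state = CNOT @ state                            # entangle the qubits
    return state ** 2                               # Born rule (real state)

print(born_distribution(0.8, 1.9))
```

Training such a model would mean tuning the rotation angles so the output distribution approximates empirical bigram frequencies; the sketch only shows the forward pass that produces the distribution.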
fullrecord | (raw Primo XML export omitted: it duplicates the title, authors, and abstract shown above, and links the published version at doi:10.1007/s10472-024-09940-y) |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2024-04 |
issn | 2331-8422 |
language | eng |
recordid | cdi_arxiv_primary_2206_02171 |
source | arXiv.org; Free E-Journals |
subjects | Algorithms; Ambiguity resolution (mathematics); Computer Science - Computation and Language; Natural language processing; Physics - Quantum Physics; Quantum computers; Quantum computing; Sequences; Support vector machines |
title | Near-Term Advances in Quantum Natural Language Processing |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-19T13%3A07%3A30IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_arxiv&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Near-Term%20Advances%20in%20Quantum%20Natural%20Language%20Processing&rft.jtitle=arXiv.org&rft.au=Widdows,%20Dominic&rft.date=2024-04-15&rft.eissn=2331-8422&rft_id=info:doi/10.48550/arxiv.2206.02171&rft_dat=%3Cproquest_arxiv%3E2673709970%3C/proquest_arxiv%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2673709970&rft_id=info:pmid/&rfr_iscdi=true |