Deeper Text Understanding for IR with Contextual Neural Language Modeling
Published in: | arXiv.org, 2019-05 |
Main authors: | Dai, Zhuyun; Callan, Jamie |
Format: | Article |
Language: | eng |
Online access: | Full text |
container_title | arXiv.org |
creator | Dai, Zhuyun; Callan, Jamie |
description | Neural networks provide new possibilities to automatically learn complex language patterns and query-document relations. Neural IR models have achieved promising results in learning query-document relevance patterns, but few explorations have been done on understanding the text content of a query or a document. This paper studies leveraging a recently-proposed contextual neural language model, BERT, to provide deeper text understanding for IR. Experimental results demonstrate that the contextual text representations from BERT are more effective than traditional word embeddings. Compared to bag-of-words retrieval models, the contextual language model can better leverage language structures, bringing large improvements on queries written in natural languages. Combining the text understanding ability with search knowledge leads to an enhanced pre-trained BERT model that can benefit related search tasks where training data are limited. |
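The description above contrasts bag-of-words matching with scoring query-document pairs using a contextual model such as BERT. Below is a minimal sketch of that idea, assuming the Hugging Face transformers library and a generic bert-base-uncased checkpoint; the checkpoint name, the untrained classification head, and the helper function are illustrative assumptions, not the authors' released ranking model.

```python
# Hypothetical sketch: scoring a query-document pair with a BERT cross-encoder.
# The relevance head is randomly initialized here; scores are only meaningful
# after fine-tuning on search (query-document relevance) data.
import torch
from transformers import AutoTokenizer, AutoModelForSequenceClassification

tokenizer = AutoTokenizer.from_pretrained("bert-base-uncased")
model = AutoModelForSequenceClassification.from_pretrained(
    "bert-base-uncased", num_labels=1  # single relevance score per pair
)
model.eval()

def relevance_score(query: str, document: str) -> float:
    """Encode query and document as one sequence pair and return a scalar score."""
    inputs = tokenizer(
        query,
        document,
        truncation=True,   # BERT inputs are capped at 512 tokens
        max_length=512,
        return_tensors="pt",
    )
    with torch.no_grad():
        logits = model(**inputs).logits
    return logits.squeeze().item()

print(relevance_score("standard garage door height",
                      "Most residential garage doors are seven feet tall."))
```

Fine-tuning such a cross-encoder on search logs or relevance judgments is roughly what the abstract describes as combining text understanding with search knowledge.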
doi_str_mv | 10.48550/arxiv.1905.09217 |
format | Article |
fulltext | fulltext |
identifier | EISSN: 2331-8422 |
ispartof | arXiv.org, 2019-05 |
issn | 2331-8422 |
language | eng |
recordid | cdi_arxiv_primary_1905_09217 |
source | arXiv.org; Free E-Journals |
subjects | Computer Science - Computation and Language; Computer Science - Information Retrieval; Language; Neural networks; Queries; Query languages |
title | Deeper Text Understanding for IR with Contextual Neural Language Modeling |
url | https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-16T20%3A02%3A05IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest_arxiv&rft_val_fmt=info:ofi/fmt:kev:mtx:journal&rft.genre=article&rft.atitle=Deeper%20Text%20Understanding%20for%20IR%20with%20Contextual%20Neural%20Language%20Modeling&rft.jtitle=arXiv.org&rft.au=Dai,%20Zhuyun&rft.date=2019-05-22&rft.eissn=2331-8422&rft_id=info:doi/10.48550/arxiv.1905.09217&rft_dat=%3Cproquest_arxiv%3E2229501477%3C/proquest_arxiv%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2229501477&rft_id=info:pmid/&rfr_iscdi=true |