Neural attention model for recommendation based on factorization machines

In recommendation systems, it is of vital importance to comprehensively consider various aspects of information to make accurate recommendations for users. When the low-order feature interactions between items are insufficient, it is necessary to mine information to learn higher-order feature interactions…

Detailed Description

In recommendation systems, it is of vital importance to comprehensively consider various aspects of information to make accurate recommendations for users. When the low-order feature interactions between items are insufficient, it is necessary to mine information to learn higher-order feature interactions. In addition, to distinguish the different importance levels of feature interactions, larger weights should be assigned to features with larger contributions to predictions, and smaller weights to those with smaller contributions. Therefore, this paper proposes a neural attention model for recommendation (NAM), which deepens factorization machines (FMs) by adding an attention mechanism and fully connected layers. Through the attention mechanism, NAM can learn the different importance levels of low-order feature interactions. By adding fully connected layers on top of the attention component, NAM can model high-order feature interactions in a nonlinear way. Experiments on two real-world datasets demonstrate that NAM has excellent performance and is superior to FM and other state-of-the-art models. The results demonstrate the effectiveness of the proposed model and the potential of using neural networks for prediction under sparse data.
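The abstract describes a concrete architecture: a factorization machine whose pairwise (low-order) feature interactions are re-weighted by an attention network, with fully connected layers stacked on top to capture higher-order interactions nonlinearly. As a reading aid, the prediction function plausibly takes the following form, assuming an AFM/NFM-style design; this is an illustration, not the paper's exact formulation:

\hat{y}(\mathbf{x}) = w_0 + \sum_{i=1}^{n} w_i x_i + f\!\left(\sum_{i=1}^{n}\sum_{j=i+1}^{n} a_{ij}\,(\mathbf{v}_i \odot \mathbf{v}_j)\, x_i x_j\right)

a_{ij} = \operatorname{softmax}_{(i,j)}\!\left(\mathbf{h}^{\top} \operatorname{ReLU}\!\left(\mathbf{W}\,(\mathbf{v}_i \odot \mathbf{v}_j)\, x_i x_j + \mathbf{b}\right)\right)

Here \mathbf{v}_i is the embedding of feature i, a_{ij} is the learned attention weight of the interaction between features i and j, and f(\cdot) is the stack of fully connected layers. The sketch below renders this in PyTorch; NAMSketch, its layer sizes, and the attention network are hypothetical stand-ins under the same assumptions, not the authors' released implementation.

# A minimal sketch of a NAM-style model: attention-weighted FM pairwise
# interactions followed by fully connected layers. Assumes an AFM/NFM-style
# design with binary (one-hot) input features, so x_i = 1 for active features.
import torch
import torch.nn as nn


class NAMSketch(nn.Module):
    def __init__(self, num_features: int, embed_dim: int = 16, attn_dim: int = 16):
        super().__init__()
        self.bias = nn.Parameter(torch.zeros(1))             # global bias w_0
        self.linear = nn.Embedding(num_features, 1)          # first-order weights w_i
        self.embed = nn.Embedding(num_features, embed_dim)   # latent vectors v_i
        # Attention network scoring each pairwise interaction (AFM-style).
        self.attn = nn.Sequential(
            nn.Linear(embed_dim, attn_dim),
            nn.ReLU(),
            nn.Linear(attn_dim, 1),
        )
        # Fully connected layers on top of the attention component, modeling
        # higher-order feature interactions nonlinearly (illustrative sizes).
        self.deep = nn.Sequential(
            nn.Linear(embed_dim, 64),
            nn.ReLU(),
            nn.Linear(64, 1),
        )

    def forward(self, feat_ids: torch.Tensor) -> torch.Tensor:
        # feat_ids: (batch, num_fields) indices of the active one-hot features.
        v = self.embed(feat_ids)                              # (B, F, D)
        # All pairwise element-wise products v_i * v_j with i < j.
        i, j = torch.triu_indices(v.size(1), v.size(1), offset=1)
        pair = v[:, i, :] * v[:, j, :]                        # (B, P, D)
        # Softmax-normalized attention weights over the P pairwise interactions.
        alpha = torch.softmax(self.attn(pair), dim=1)         # (B, P, 1)
        pooled = (alpha * pair).sum(dim=1)                    # (B, D)
        first_order = self.linear(feat_ids).sum(dim=1)        # (B, 1)
        return (self.bias + first_order + self.deep(pooled)).squeeze(-1)


# Tiny smoke test on random sparse inputs.
model = NAMSketch(num_features=1000)
batch = torch.randint(0, 1000, (4, 8))   # 4 instances, 8 active features each
print(model(batch).shape)                 # torch.Size([4])

The softmax over pairwise scores is what lets the model assign larger weights to interactions that contribute more to the prediction, while the final fully connected stack supplies the nonlinear modeling of high-order interactions that the abstract emphasizes.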

Saved in:
Bibliographic Details
Published in: Applied intelligence (Dordrecht, Netherlands), 2021-04, Vol. 51 (4), p. 1829-1844
Main Authors: Wen, Peng; Yuan, Weihua; Qin, Qianqian; Sang, Sheng; Zhang, Zhijun
Format: Article
Language: English
Subjects: Artificial Intelligence; Computer Science; Factorization; Machines; Manufacturing; Mechanical Engineering; Neural networks; Processes; Recommender systems
Online Access: Full text
DOI: 10.1007/s10489-020-01921-y
ISSN: 0924-669X
EISSN: 1573-7497
Publisher: Springer US, New York