DELTA: Dynamic Embedding Learning with Truncated Conscious Attention for CTR Prediction

Click-Through Rate (CTR) prediction is a pivotal task in product and content recommendation, where learning effective feature embeddings is of great significance. However, traditional methods typically learn fixed feature representations without dynamically refining feature representations according to the context information, leading to suboptimal performance. Some recent approaches attempt to address this issue by learning bit-wise weights or augmented embeddings for feature representations, but suffer from uninformative or redundant features in the context. To tackle this problem, inspired by the Global Workspace Theory in conscious processing, which posits that only a specific subset of the product features are pertinent while the rest can be noisy and even detrimental to human-click behaviors, we propose a CTR model that enables Dynamic Embedding Learning with Truncated Conscious Attention for CTR prediction, termed DELTA. DELTA contains two key components: (I) conscious truncation module (CTM), which utilizes curriculum learning to apply adaptive truncation on attention weights to select the most critical features in the context; (II) explicit embedding optimization (EEO), which applies an auxiliary task during training that directly and independently propagates the gradient from the loss layer to the embedding layer, thereby optimizing the embedding explicitly via linear feature crossing. Extensive experiments on five challenging CTR datasets demonstrate that DELTA achieves new state-of-the-art performance among current CTR methods.

Detailed Description

Saved in:
Bibliographic Details
Main Authors: Zhu, Chen, Du, Liang, Chen, Hong, Zhao, Shuang, Sun, Zixun, Wang, Xin, Zhu, Wenwu
Format: Artikel
Language: eng
Subjects:
Online Access: Order full text
creator Zhu, Chen
Du, Liang
Chen, Hong
Zhao, Shuang
Sun, Zixun
Wang, Xin
Zhu, Wenwu
description Click-Through Rate (CTR) prediction is a pivotal task in product and content recommendation, where learning effective feature embeddings is of great significance. However, traditional methods typically learn fixed feature representations without dynamically refining feature representations according to the context information, leading to suboptimal performance. Some recent approaches attempt to address this issue by learning bit-wise weights or augmented embeddings for feature representations, but suffer from uninformative or redundant features in the context. To tackle this problem, inspired by the Global Workspace Theory in conscious processing, which posits that only a specific subset of the product features are pertinent while the rest can be noisy and even detrimental to human-click behaviors, we propose a CTR model that enables Dynamic Embedding Learning with Truncated Conscious Attention for CTR prediction, termed DELTA. DELTA contains two key components: (I) conscious truncation module (CTM), which utilizes curriculum learning to apply adaptive truncation on attention weights to select the most critical features in the context; (II) explicit embedding optimization (EEO), which applies an auxiliary task during training that directly and independently propagates the gradient from the loss layer to the embedding layer, thereby optimizing the embedding explicitly via linear feature crossing. Extensive experiments on five challenging CTR datasets demonstrate that DELTA achieves new state-of-the-art performance among current CTR methods.
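The truncation idea in the CTM can be sketched as follows. This is a minimal illustration, not the paper's implementation: the function name `truncated_attention`, the hard top-k selection, and the fixed `keep_ratio` schedule are assumptions standing in for DELTA's adaptive, curriculum-learned truncation of attention weights.

```python
import numpy as np

def softmax(x):
    # Numerically stable softmax over feature attention scores.
    e = np.exp(x - x.max())
    return e / e.sum()

def truncated_attention(scores, keep_ratio):
    """Keep only the strongest attention weights and renormalize.

    `keep_ratio` is a hypothetical knob: DELTA instead learns an
    adaptive truncation via curriculum learning.
    """
    w = softmax(scores)
    k = max(1, int(np.ceil(keep_ratio * len(w))))
    top = np.argsort(w)[::-1][:k]      # indices of the k most critical features
    mask = np.zeros_like(w)
    mask[top] = 1.0
    w = w * mask                       # drop uninformative/redundant features
    return w / w.sum()                 # renormalize surviving weights

# Curriculum-style schedule: begin with all features, truncate harder over time.
scores = np.array([2.0, 0.5, -1.0, 0.1])
for keep_ratio in (1.0, 0.75, 0.5):
    weights = truncated_attention(scores, keep_ratio)
```

Zeroing out the weakest weights and renormalizing is one simple way to realize the "only a subset of features is pertinent" intuition; the paper's adaptive mechanism decides the cutoff dynamically rather than by a fixed ratio.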
doi_str_mv 10.48550/arxiv.2305.04891
format Article
identifier DOI: 10.48550/arxiv.2305.04891
language eng
recordid cdi_arxiv_primary_2305_04891
source arXiv.org
subjects Computer Science - Information Retrieval
Computer Science - Learning
title DELTA: Dynamic Embedding Learning with Truncated Conscious Attention for CTR Prediction
url https://arxiv.org/abs/2305.04891