Contrastive Information Transfer for Pre-Ranking Systems

Bibliographic Details
Main Authors: Cao, Yue; Zhou, XiaoJiang; Huang, Peihao; Xiao, Yao; Chen, Dayao; Chen, Sheng
Format: Article
Language: English
Subjects: Computer Science - Information Retrieval
Online Access: Order full text
creator Cao, Yue; Zhou, XiaoJiang; Huang, Peihao; Xiao, Yao; Chen, Dayao; Chen, Sheng
description Real-world search and recommender systems usually adopt a multi-stage ranking architecture, including matching, pre-ranking, ranking, and re-ranking. Previous works mainly focus on the ranking stage, while very few focus on the pre-ranking stage. In this paper, we focus on the information transfer from the ranking stage to the pre-ranking stage. We propose a new Contrastive Information Transfer (CIT) framework to transfer useful information from the ranking model to the pre-ranking model. We train the pre-ranking model to distinguish the positive pair of representations from a set of positive and negative pairs with a contrastive objective. As a consequence, the pre-ranking model can make full use of the rich information in the ranking model's representations. The CIT framework also has the advantage of alleviating selection bias and improving recall metrics, which is crucial for pre-ranking models. We conduct extensive experiments on offline datasets and with online A/B testing. Experimental results show that CIT achieves superior results compared to competitive models. In addition, a strict online A/B test at one of the world's largest e-commerce platforms shows that the proposed model achieves a 0.63% improvement in CTR and a 1.64% improvement in VBR. The proposed model has now been deployed online and serves the main traffic of this system, contributing remarkable business growth.
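The description above trains the pre-ranking model to pick out the matching ranking-model representation from a set of candidates using a contrastive objective. The snippet below is a minimal, hypothetical sketch of such an InfoNCE-style transfer loss, not the paper's exact CIT formulation: the function name, the use of in-batch negatives, the `temperature` value, and the choice to freeze the ranking ("teacher") representations are all assumptions for illustration.

```python
import torch
import torch.nn.functional as F

def contrastive_transfer_loss(pre_rank_emb, rank_emb, temperature=0.1):
    """Illustrative InfoNCE-style objective (assumed, not the paper's exact loss).

    For each item, the pre-ranking representation should match the ranking
    model's representation of the same item (positive pair) rather than the
    representations of the other items in the batch (negative pairs).

    pre_rank_emb: [batch, dim] representations from the pre-ranking model
    rank_emb:     [batch, dim] representations from the ranking model
    """
    # Cosine similarity between every pre-ranking / ranking pair in the batch.
    pre = F.normalize(pre_rank_emb, dim=-1)
    rank = F.normalize(rank_emb, dim=-1).detach()  # assume the ranking model is not updated
    logits = pre @ rank.t() / temperature          # [batch, batch] similarity matrix

    # Diagonal entries are the positive pairs; all off-diagonal entries act as negatives.
    targets = torch.arange(pre.size(0), device=pre.device)
    return F.cross_entropy(logits, targets)
```

In a setup like this, the contrastive term would typically be added to the pre-ranking model's usual click-prediction loss with a weighting coefficient, e.g. `total_loss = ctr_loss + alpha * contrastive_transfer_loss(pre_emb, teacher_emb)`, where `alpha` is a hypothetical hyperparameter.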
doi_str_mv 10.48550/arxiv.2207.03073
format Article
identifier DOI: 10.48550/arxiv.2207.03073
language eng
recordid cdi_arxiv_primary_2207_03073
source arXiv.org
subjects Computer Science - Information Retrieval
title Contrastive Information Transfer for Pre-Ranking Systems
url https://arxiv.org/abs/2207.03073