Equal Experience in Recommender Systems

We explore the fairness issue that arises in recommender systems. Biased data due to inherent stereotypes of particular groups (e.g., male students' average rating on mathematics is often higher than that on humanities, and vice versa for females) may yield a limited scope of suggested items to a certain group of users. Our main contribution lies in the introduction of a novel fairness notion (that we call equal experience), which can serve to regulate such unfairness in the presence of biased data. The notion captures the degree of the equal experience of item recommendations across distinct groups. We propose an optimization framework that incorporates the fairness notion as a regularization term, as well as introduce computationally-efficient algorithms that solve the optimization. Experiments on synthetic and benchmark real datasets demonstrate that the proposed framework can indeed mitigate such unfairness while exhibiting a minor degradation of recommendation accuracy.

Bibliographic Details
Main authors: Cho, Jaewoong; Choi, Moonseok; Suh, Changho
Format: Article
Language: English
creator Cho, Jaewoong
Choi, Moonseok
Suh, Changho
description We explore the fairness issue that arises in recommender systems. Biased data due to inherent stereotypes of particular groups (e.g., male students' average rating on mathematics is often higher than that on humanities, and vice versa for females) may yield a limited scope of suggested items to a certain group of users. Our main contribution lies in the introduction of a novel fairness notion (that we call equal experience), which can serve to regulate such unfairness in the presence of biased data. The notion captures the degree of the equal experience of item recommendations across distinct groups. We propose an optimization framework that incorporates the fairness notion as a regularization term, as well as introduce computationally-efficient algorithms that solve the optimization. Experiments on synthetic and benchmark real datasets demonstrate that the proposed framework can indeed mitigate such unfairness while exhibiting a minor degradation of recommendation accuracy.
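The abstract describes an optimization framework that adds the fairness notion as a regularization term to the recommendation objective. As a rough illustration of that general idea (not the authors' implementation — the penalty form, variable names, and hyperparameters below are all assumptions), a group-parity penalty can be attached to a plain matrix-factorization loss like so:

```python
import numpy as np

# Illustrative sketch: matrix factorization trained by gradient descent,
# with an added penalty on the squared difference between two user groups'
# average predicted scores. Everything here is an assumption for
# illustration, not the paper's actual algorithm.

rng = np.random.default_rng(0)
n_users, n_items, k = 20, 15, 4
R = rng.integers(1, 6, size=(n_users, n_items)).astype(float)  # observed ratings
group = rng.integers(0, 2, size=n_users)                       # binary sensitive group

# Signed weights: +1/|G0| for group-0 users, -1/|G1| for group-1 users,
# so w @ row_means(pred) equals the gap between the two group means.
w = np.where(group == 0, 1.0 / (group == 0).sum(), -1.0 / (group == 1).sum())

U = rng.normal(scale=0.1, size=(n_users, k))   # user factors
V = rng.normal(scale=0.1, size=(n_items, k))   # item factors
lam, lr = 10.0, 0.1                            # fairness weight, step size

def fit_and_gap(U, V):
    pred = U @ V.T
    fit = np.mean((R - pred) ** 2)             # reconstruction error
    gap = w @ pred.mean(axis=1)                # group-mean score difference
    return fit, gap

fit_init, _ = fit_and_gap(U, V)
for _ in range(3000):
    pred = U @ V.T
    gap = w @ pred.mean(axis=1)
    # Gradient of fit + lam * gap**2 with respect to the predictions.
    g_pred = (2 * (pred - R) / R.size
              + lam * 2 * gap * np.outer(w, np.ones(n_items)) / n_items)
    gU, gV = g_pred @ V, g_pred.T @ U          # chain rule through pred = U @ V.T
    U -= lr * gU
    V -= lr * gV

fit_final, gap_final = fit_and_gap(U, V)
```

Tuning `lam` trades recommendation accuracy against the fairness penalty, mirroring the accuracy/fairness trade-off the abstract reports.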
doi_str_mv 10.48550/arxiv.2210.05936
format Article
identifier DOI: 10.48550/arxiv.2210.05936
language eng
source arXiv.org
subjects Computer Science - Information Retrieval
Computer Science - Learning
Statistics - Machine Learning
title Equal Experience in Recommender Systems