Black Loans Matter: Distributionally Robust Fairness for Fighting Subgroup Discrimination

Algorithmic fairness in lending today relies on group fairness metrics for monitoring statistical parity across protected groups. This approach is vulnerable to subgroup discrimination by proxy, carrying significant risks of legal and reputational damage for lenders and blatantly unfair outcomes for borrowers. Practical challenges arise from the many possible combinations and subsets of protected groups. We motivate this problem against the backdrop of historical and residual racism in the United States polluting all available training data and raising public sensitivity to algorithmic bias. We review the current regulatory compliance protocols for fairness in lending and discuss their limitations relative to the contributions state-of-the-art fairness methods may afford. We propose a solution for addressing subgroup discrimination, while adhering to existing group fairness requirements, from recent developments in individual fairness methods and corresponding fair metric learning algorithms.

Bibliographic Details
Published in: arXiv.org 2020-11
Main authors: Weber, Mark; Yurochkin, Mikhail; Botros, Sherif; Markov, Vanio
Format: Article
Language: English
Subjects:
Online access: Full text
container_title arXiv.org
creator Weber, Mark
Yurochkin, Mikhail
Botros, Sherif
Markov, Vanio
description Algorithmic fairness in lending today relies on group fairness metrics for monitoring statistical parity across protected groups. This approach is vulnerable to subgroup discrimination by proxy, carrying significant risks of legal and reputational damage for lenders and blatantly unfair outcomes for borrowers. Practical challenges arise from the many possible combinations and subsets of protected groups. We motivate this problem against the backdrop of historical and residual racism in the United States polluting all available training data and raising public sensitivity to algorithmic bias. We review the current regulatory compliance protocols for fairness in lending and discuss their limitations relative to the contributions state-of-the-art fairness methods may afford. We propose a solution for addressing subgroup discrimination, while adhering to existing group fairness requirements, from recent developments in individual fairness methods and corresponding fair metric learning algorithms.
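The vulnerability the abstract describes can be made concrete with a small sketch. The example below is not from the paper; it uses synthetic, hypothetical data and hand-rolled helper functions to show how single-axis statistical parity checks can both pass while intersectional subgroups experience large approval-rate gaps.

```python
# Illustrative sketch (synthetic data, not from the paper): group-level
# statistical parity can mask subgroup discrimination at intersections.

def approval_rate(decisions):
    """Fraction of loan applications approved (1 = approved, 0 = denied)."""
    return sum(decisions) / len(decisions)

# Hypothetical loan decisions keyed by (race, gender) subgroup.
decisions = {
    ("A", "M"): [1, 1, 1, 0],  # 75% approved
    ("A", "F"): [1, 0, 0, 0],  # 25% approved
    ("B", "M"): [0, 0, 0, 1],  # 25% approved
    ("B", "F"): [0, 1, 1, 1],  # 75% approved
}

def group_rate(axis, value):
    """Approval rate pooled over one protected attribute (0 = race, 1 = gender)."""
    pooled = [d for key, ds in decisions.items() if key[axis] == value for d in ds]
    return approval_rate(pooled)

# Every single-axis group check passes: each marginal rate is exactly 0.50.
marginals = {("race", v): group_rate(0, v) for v in ("A", "B")}
marginals.update({("gender", v): group_rate(1, v) for v in ("M", "F")})

# Yet the intersectional subgroups differ by 50 percentage points.
subgroup_rates = {k: approval_rate(v) for k, v in decisions.items()}
gap = max(subgroup_rates.values()) - min(subgroup_rates.values())
```

Auditing every intersection explicitly is what becomes impractical as the number of protected attributes grows, which is the combinatorial challenge the abstract points to.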
format Article
publisher Ithaca: Cornell University Library, arXiv.org
publishDate 2020-11-27
rights 2020. This work is published under http://creativecommons.org/licenses/by/4.0/ (the “License”). Notwithstanding the ProQuest Terms and Conditions, you may use this content in accordance with the terms of the License.
fulltext fulltext
identifier EISSN: 2331-8422
ispartof arXiv.org, 2020-11
issn 2331-8422
language eng
recordid cdi_proquest_journals_2466556667
source Free E-Journals
subjects Algorithms
Discrimination
Loans
Machine learning
Racism
Subgroups
title Black Loans Matter: Distributionally Robust Fairness for Fighting Subgroup Discrimination
url https://sfx.bib-bvb.de/sfx_tum?ctx_ver=Z39.88-2004&ctx_enc=info:ofi/enc:UTF-8&ctx_tim=2025-01-30T15%3A48%3A51IST&url_ver=Z39.88-2004&url_ctx_fmt=infofi/fmt:kev:mtx:ctx&rfr_id=info:sid/primo.exlibrisgroup.com:primo3-Article-proquest&rft_val_fmt=info:ofi/fmt:kev:mtx:book&rft.genre=document&rft.atitle=Black%20Loans%20Matter:%20Distributionally%20Robust%20Fairness%20for%20Fighting%20Subgroup%20Discrimination&rft.jtitle=arXiv.org&rft.au=Weber,%20Mark&rft.date=2020-11-27&rft.eissn=2331-8422&rft_id=info:doi/&rft_dat=%3Cproquest%3E2466556667%3C/proquest%3E%3Curl%3E%3C/url%3E&disable_directlink=true&sfx.directlink=off&sfx.report_link=0&rft_id=info:oai/&rft_pqid=2466556667&rft_id=info:pmid/&rfr_iscdi=true