Reconciling Predictive and Statistical Parity: A Causal Approach

Since the rise of fair machine learning as a critical field of inquiry, many different notions on how to quantify and measure discrimination have been proposed in the literature. Some of these notions, however, were shown to be mutually incompatible. Such findings make it appear that numerous different kinds of fairness exist, thereby making a consensus on the appropriate measure of fairness harder to reach, hindering the applications of these tools in practice. In this paper, we investigate one of these key impossibility results that relates the notions of statistical and predictive parity. Specifically, we derive a new causal decomposition formula for the fairness measures associated with predictive parity, and obtain a novel insight into how this criterion is related to statistical parity through the legal doctrines of disparate treatment, disparate impact, and the notion of business necessity. Our results show that through a more careful causal analysis, the notions of statistical and predictive parity are not really mutually exclusive, but complementary and spanning a spectrum of fairness notions through the concept of business necessity. Finally, we demonstrate the importance of our findings on a real-world example.

Bibliographic details
Main authors: Plecko, Drago; Bareinboim, Elias
Format: Article
Language: English
Subjects:
Online access: Order full text
description Since the rise of fair machine learning as a critical field of inquiry, many different notions on how to quantify and measure discrimination have been proposed in the literature. Some of these notions, however, were shown to be mutually incompatible. Such findings make it appear that numerous different kinds of fairness exist, thereby making a consensus on the appropriate measure of fairness harder to reach, hindering the applications of these tools in practice. In this paper, we investigate one of these key impossibility results that relates the notions of statistical and predictive parity. Specifically, we derive a new causal decomposition formula for the fairness measures associated with predictive parity, and obtain a novel insight into how this criterion is related to statistical parity through the legal doctrines of disparate treatment, disparate impact, and the notion of business necessity. Our results show that through a more careful causal analysis, the notions of statistical and predictive parity are not really mutually exclusive, but complementary and spanning a spectrum of fairness notions through the concept of business necessity. Finally, we demonstrate the importance of our findings on a real-world example.
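The abstract contrasts two standard group-fairness criteria. As a quick illustration (these are the common textbook definitions, not the paper's causal decomposition; the function and variable names are my own), statistical parity compares positive-prediction rates across groups, while predictive parity compares positive predictive values:

```python
import numpy as np

def statistical_parity_gap(y_hat, group):
    """Gap in positive-prediction rates: P(Yhat=1 | G=1) - P(Yhat=1 | G=0)."""
    y_hat, group = np.asarray(y_hat), np.asarray(group)
    return y_hat[group == 1].mean() - y_hat[group == 0].mean()

def predictive_parity_gap(y, y_hat, group):
    """Gap in positive predictive values: P(Y=1 | Yhat=1, G=1) - P(Y=1 | Yhat=1, G=0)."""
    y, y_hat, group = np.asarray(y), np.asarray(y_hat), np.asarray(group)
    ppv = lambda g: y[(y_hat == 1) & (group == g)].mean()
    return ppv(1) - ppv(0)
```

A classifier can satisfy one criterion while violating the other on the same data, which is the kind of incompatibility the paper revisits through the lens of business necessity.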
doi_str_mv 10.48550/arxiv.2306.05059
format Article
creationdate 2023-06-08
rights http://creativecommons.org/licenses/by/4.0 (free to read)
recordid cdi_arxiv_primary_2306_05059
source arXiv.org
subjects Computer Science - Artificial Intelligence
Computer Science - Computers and Society
Computer Science - Learning
Statistics - Machine Learning