Reconciling Predictive and Statistical Parity: A Causal Approach
Format: | Article |
Language: | English |
Abstract: | Since the rise of fair machine learning as a critical field of inquiry, many
different notions of how to quantify and measure discrimination have been
proposed in the literature. Some of these notions, however, were shown to be
mutually incompatible. Such findings make it appear that numerous distinct
kinds of fairness exist, making a consensus on the appropriate measure of
fairness harder to reach and hindering the application of these tools in
practice. In this paper, we investigate one of the key impossibility results
relating the notions of statistical and predictive parity. Specifically, we
derive a new causal decomposition formula for the fairness measures associated
with predictive parity, and obtain a novel insight into how this criterion
relates to statistical parity through the legal doctrines of disparate
treatment, disparate impact, and the notion of business necessity. Our results
show that, under a more careful causal analysis, statistical and predictive
parity are not mutually exclusive but complementary, spanning a spectrum of
fairness notions through the concept of business necessity. Finally, we
demonstrate the importance of our findings on a real-world example. |
---|---|
DOI: | 10.48550/arxiv.2306.05059 |
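The abstract contrasts two fairness criteria: statistical parity (equal positive-prediction rates across groups) and predictive parity (equal precision across groups). The following is a minimal sketch, using hypothetical toy data, of how the two measures differ; the variable names and dataset are illustrative assumptions, not taken from the paper:

```python
# Toy example (hypothetical data): a binary protected attribute A,
# true label Y, and model prediction Yhat for eight individuals.
A    = [0, 0, 0, 0, 1, 1, 1, 1]  # protected attribute (group membership)
Y    = [1, 0, 1, 0, 1, 1, 0, 0]  # true outcome
Yhat = [1, 1, 0, 0, 1, 1, 1, 0]  # model prediction

def rate(xs):
    # Fraction of 1s in a list of binary values.
    return sum(xs) / len(xs)

def statistical_parity_gap(A, Yhat):
    # Statistical parity compares acceptance rates P(Yhat = 1 | A = a).
    return (rate([p for a, p in zip(A, Yhat) if a == 1])
            - rate([p for a, p in zip(A, Yhat) if a == 0]))

def predictive_parity_gap(A, Y, Yhat):
    # Predictive parity compares precision P(Y = 1 | Yhat = 1, A = a).
    def ppv(group):
        return rate([y for a, y, p in zip(A, Y, Yhat)
                     if a == group and p == 1])
    return ppv(1) - ppv(0)

print(statistical_parity_gap(A, Yhat))    # 0.25
print(predictive_parity_gap(A, Y, Yhat))  # 1/6 ≈ 0.1667
```

On this toy data both gaps are nonzero, and in general a classifier can satisfy one criterion while violating the other; the paper's causal decomposition is what relates the two through the notion of business necessity.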