Investigating the Legality of Bias Mitigation Methods in the United Kingdom
Published in: IEEE Technology & Society Magazine, 2023-12, Vol. 42 (4), pp. 87-94
Main authors: , , , , , ,
Format: Magazine article
Language: English
Subjects:
Online access: Full text
Abstract: Fairness issues in Algorithmic Decision-Making Systems (ADMS) have been well highlighted over the past decade [1], including facial recognition systems struggling to identify people of color [2]. In 2021, Uber drivers filed a claim with the U.K.'s employment tribunal for unfair dismissal resulting from automated facial recognition technology by Microsoft [3]. Bias mitigation methods have been developed to reduce discrimination from ADMS; these typically operationalize fairness notions as fairness metrics to minimize discrimination [4]. We refer to ADMS to which bias mitigation methods have been applied as "mitigated ADMS" or, in the singular, a "mitigated system."
ISSN: 0278-0097, 1937-416X
DOI: 10.1109/MTS.2023.3341465