Role of fairness, accountability, and transparency in algorithmic affordance

Bibliographic Details
Published in: Computers in Human Behavior, 2019-09, Vol. 98, pp. 277-284
Main authors: Shin, Donghee; Park, Yong Jin
Format: Article
Language: English
Online access: Full text
Description
Abstract: As algorithm-based services proliferate, social issues such as fairness, accountability, and transparency (FAT) must be addressed. This study conceptualizes these issues and examines how they influence the use and adoption of algorithmic services. In particular, we investigate how trust relates to these issues and how trust shapes the user experience of algorithmic services. A mixed-methods design was used, integrating interpretive methods with surveys. The overall results show the heuristic role of fairness, accountability, and transparency and their fundamental links to trust. Despite the importance of algorithms, no single testable definition of the term has been established. We reconstruct the understanding of algorithms and their affordances in terms of user perception, invariant properties, and contextuality. The study concludes by arguing that algorithmic affordance offers a distinctive perspective on the conceptualization of algorithmic processes. How individuals perceive FAT in practice remains an important topic for further study.

Highlights:
• Conceptualizes fairness, accountability, and transparency (FAT).
• Examines how FAT influences the use and adoption of algorithmic services.
• Examines how trust relates to FAT and how trust influences the user experience of algorithmic services.
ISSN: 0747-5632; 1873-7692
DOI: 10.1016/j.chb.2019.04.019