What to expect from opening up ‘black boxes’? Comparing perceptions of justice between human and automated agents
Saved in:
Published in: Computers in Human Behavior, 2021-09, Vol. 122, p. 106837, Article 106837
Main authors:
Format: Article
Language: English
Subjects:
Online access: Full text
Abstract: Advances in artificial intelligence contribute to increasing automation of decisions. In a healthcare-scheduling context, this study compares the effects of decision agents and explanations for decisions on decision-recipients' perceptions of justice. In a 2 (decision agent: automated vs. human) × 3 (explanation: no explanation vs. equality-explanation vs. equity-explanation) between-subjects online study, 209 healthcare professionals were asked to put themselves in a situation where their vacation request was denied by either a human or an automated agent. Participants either received no explanation or an explanation based on equality or equity norms. Perceptions of interpersonal justice were stronger for the human agent. Additionally, participants perceived human agents as offering more voice and automated agents as being more consistent in decision-making. When given no explanation, perceptions of informational justice were impaired only for the human decision agent. In the study's second part, participants took the perspective of a decision-maker and were given the choice to delegate decision-making to an automated system. Participants who delegated an unpleasant decision to the system frequently externalized responsibility and showed different response patterns when confronted by a decision-recipient who asked for a rationale for the decision.
Highlights:
• Interpersonal justice perceptions were higher when a human agent made the decision.
• Explanations by human agents led to higher informational justice perceptions.
• Explanations by automated agents did not change informational justice perceptions.
• Applicability of justice theory to human-automation interaction is discussed.
• Delegation of decisions to automated systems affected the decision rationale.
ISSN: 0747-5632, 1873-7692
DOI: 10.1016/j.chb.2021.106837