Human beings and robots: are there any differences in the attribution of punishments for the same crimes?

Bibliographic Details
Published in: Behaviour & Information Technology, 2021-04, Vol. 40 (5), p. 445-453
Authors: Guidi, Stefano; Marchigiani, Enrica; Roncato, Sergio; Parlangeli, Oronzo
Format: Article
Language: English
Keywords:
Online access: Full text
Description
Abstract: As collaborative robots and artificial intelligence (AI) systems are being deployed in an ever-increasing range of contexts, we are increasingly called upon to make judgements on their moral behaviour. Understanding the factors that affect our ethical judgements involving these types of agents thus seems of the utmost importance for safer and better-regulated interactions between humans and machines. So far, however, this topic has rarely been investigated. We compared the perceived seriousness of an action committed by either a person or a robot, causing harm to either persons or robots, and the attribution of the appropriate punishment for that action. The results showed a significant effect of the type of victim: the action was considered a more serious offence, and deemed worthy of more severe punishment, when the victims were humans than when they were robots. A significant agent-by-victim interaction was also found in the punishment judgements: for human victims, a human agent was punished more severely than a robot, while for robot victims, a robot agent was attributed a more severe punishment than a human one. The results are discussed in light of theories linking moral judgements to mind perception.
ISSN: 0144-929X, 1362-3001
DOI: 10.1080/0144929X.2021.1905879