Design Requirements for a Moral Machine for Autonomous Weapons
Main authors:
Format: Conference paper
Language: English
Keywords:
Online access: Full text
Abstract: Autonomous Weapon Systems (AWS) are said to become the third revolution in warfare. These systems raise many questions and concerns that demand in-depth research on ethical and moral responsibility. Ethical decision-making is studied in related fields like Autonomous Vehicles and Human Operated drones, but it has not yet been fully extended to the deployment of AWS, and research on moral judgement is lacking. In this paper, we propose design requirements for a Moral Machine (similar to http://moralmachine.mit.edu/) for Autonomous Weapons to conduct a large-scale study of the moral judgement of people regarding the deployment of this type of weapon. We ran an online survey to get a first impression of the importance of six variables that will be implemented in a proof-of-concept of a Moral Machine for Autonomous Weapons, and we describe a scenario containing these six variables. The platform will enable large-scale randomized controlled experiments and generate knowledge about people's feelings concerning this type of weapon. The next steps of our study include development and testing of the design before the prototype is scaled up to a Massive Online Experiment.
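The abstract describes a platform in which each scenario is composed of six variables and shown to participants in randomized controlled experiments. As a purely illustrative sketch, and assuming nothing about the paper's actual design, the snippet below shows one way such factorial scenarios could be represented and randomly sampled; the factor names and levels are hypothetical placeholders, not the six variables chosen by the authors.

```python
import random
from dataclasses import dataclass

# Placeholder factors and levels (hypothetical; the real six variables
# are defined in the paper, not reproduced here).
FACTORS = {
    "decision_maker": ["human operator", "autonomous system"],
    "target_type": ["military", "dual-use"],
    "collateral_risk": ["low", "high"],
    "military_advantage": ["minor", "major"],
    "oversight": ["human on the loop", "no oversight"],
    "target_certainty": ["confirmed", "uncertain"],
}

@dataclass
class Scenario:
    levels: dict  # factor name -> randomly assigned level

def sample_scenario(rng: random.Random) -> Scenario:
    """Draw one randomized scenario by picking a level for each factor."""
    return Scenario({name: rng.choice(levels) for name, levels in FACTORS.items()})

if __name__ == "__main__":
    rng = random.Random(42)  # fixed seed for reproducible assignment
    for _ in range(3):
        print(sample_scenario(rng).levels)
```

In a randomized controlled setup of this kind, each participant would be shown independently sampled scenarios so that judgements can be compared across factor levels; how the actual prototype assigns conditions is left open in the abstract.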
ISSN: 0302-9743, 1611-3349
DOI: 10.1007/978-3-319-99229-7_44