This “Ethical Trap” Is for Roboticists, Not Robots: On the Issue of Artificial Agent Ethical Decision-Making

In this paper we address the question of when a researcher is justified in describing his or her artificial agent as demonstrating ethical decision-making. The paper is motivated by the amount of research being done that attempts to imbue artificial agents with expertise in ethical decision-making....

Detailed Description

Bibliographic Details
Published in: Science and engineering ethics, 2017-04, Vol. 23 (2), p. 389-401
Main authors: Miller, Keith W.; Wolf, Marty J.; Grodzinsky, Frances
Format: Article
Language: English
Subjects: see list below
Online access: Full text
Description: In this paper we address the question of when a researcher is justified in describing his or her artificial agent as demonstrating ethical decision-making. The paper is motivated by the amount of research being done that attempts to imbue artificial agents with expertise in ethical decision-making. It seems clear that computing systems make decisions, in that they make choices between different options; and there is scholarship in philosophy that addresses the distinction between ethical decision-making and general decision-making. Essentially, the qualitative difference between ethical decisions and general decisions is that ethical decisions must be part of the process of developing ethical expertise within an agent. We use this distinction in examining publicity surrounding a particular experiment in which a simulated robot attempted to safeguard simulated humans from falling into a hole. We conclude that any suggestions that this simulated robot was making ethical decisions were misleading.
DOI: 10.1007/s11948-016-9785-y
PMID: 27116039
Publisher: Springer Netherlands (Dordrecht)
Rights: Springer Science+Business Media Dordrecht 2016; Science and Engineering Ethics is a copyright of Springer, 2017.
ISSN: 1353-3452
EISSN: 1471-5546
Source: MEDLINE; SpringerLink Journals - AutoHoldings
Subjects:
Agents (artificial intelligence)
Bioethics
Biomedical Engineering and Bioengineering
Computer simulation
Decision making
Decision Making - ethics
Decisions
Education
Engineering
Ethics
Falling
Humans
Medicine/Public Health
Original Paper
Philosophy
Philosophy of Science
Researchers
Robotics
Robotics - ethics
Robotics - standards
Robots